Mar 13 11:48:03 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 11:48:03 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 
11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 11:48:03 crc 
restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 
11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:03 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 
11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 11:48:04 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 11:48:04 crc kubenswrapper[4837]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.789774 4837 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795132 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795170 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795182 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795194 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795205 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795215 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795223 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795231 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795239 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795247 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795256 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795264 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795272 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795283 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795291 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795299 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795306 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795314 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795322 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795329 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795337 4837 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795344 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795352 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795362 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795371 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795379 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795396 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795404 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795411 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795419 4837 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795427 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795436 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795444 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795453 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795462 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795470 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795478 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795486 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795495 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795502 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795510 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795518 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795529 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795539 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795549 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795559 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795570 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795579 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795591 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795601 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795611 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795620 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795630 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795667 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795676 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795685 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795693 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795705 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795713 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795721 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795729 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795736 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795745 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795757 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
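Note on the --pod-infra-container-image warning above: the kubelet reports that this image is no longer pruned by its image garbage collector and should also be configured in the remote runtime. On a CRI-O node that normally means the pause_image setting in the CRI-O configuration; the CRI-O config is not part of this log, so the following is only a hedged sketch of how one might confirm it matches the value shown later in the FLAG dump for --pod-infra-container-image:

    # Sketch only; assumes CRI-O's standard config locations, which are not shown in this log.
    grep -r pause_image /etc/crio/crio.conf /etc/crio/crio.conf.d/ 2>/dev/null
    # expected to print a line of the form
    #   pause_image = "quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:<digest>"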
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795767 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795774 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795782 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795790 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795798 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795806 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.795813 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796863 4837 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796894 4837 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796918 4837 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796932 4837 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796969 4837 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796981 4837 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.796997 4837 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797013 4837 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797022 4837 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797031 4837 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797041 4837 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797050 4837 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797060 4837 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797070 4837 flags.go:64] FLAG: --cgroup-root="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797078 4837 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797089 4837 flags.go:64] FLAG: --client-ca-file="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797098 4837 flags.go:64] FLAG: --cloud-config="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797107 4837 flags.go:64] FLAG: --cloud-provider="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797117 4837 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797127 4837 flags.go:64] FLAG: --cluster-domain="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797136 4837 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 
11:48:04.797145 4837 flags.go:64] FLAG: --config-dir="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797153 4837 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797164 4837 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797175 4837 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797184 4837 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797193 4837 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797203 4837 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797212 4837 flags.go:64] FLAG: --contention-profiling="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797221 4837 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797230 4837 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797240 4837 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797248 4837 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797260 4837 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797269 4837 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797277 4837 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797286 4837 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797295 4837 flags.go:64] FLAG: --enable-server="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797303 4837 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797317 4837 flags.go:64] FLAG: --event-burst="100" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797326 4837 flags.go:64] FLAG: --event-qps="50" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797335 4837 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797345 4837 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797353 4837 flags.go:64] FLAG: --eviction-hard="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797364 4837 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.797373 4837 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798141 4837 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798164 4837 flags.go:64] FLAG: --eviction-soft="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798174 4837 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798185 4837 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798195 4837 flags.go:64] FLAG: 
--experimental-allocatable-ignore-eviction="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798207 4837 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798216 4837 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798225 4837 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798234 4837 flags.go:64] FLAG: --feature-gates="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798245 4837 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798255 4837 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798264 4837 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798273 4837 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798283 4837 flags.go:64] FLAG: --healthz-port="10248" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798293 4837 flags.go:64] FLAG: --help="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798302 4837 flags.go:64] FLAG: --hostname-override="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798311 4837 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798320 4837 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798329 4837 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798338 4837 flags.go:64] FLAG: --image-credential-provider-config="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798346 4837 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798355 4837 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798364 4837 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798374 4837 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798383 4837 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798392 4837 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798401 4837 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798410 4837 flags.go:64] FLAG: --kube-reserved="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798419 4837 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798428 4837 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798438 4837 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798447 4837 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798456 4837 flags.go:64] FLAG: --lock-file="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798465 4837 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798474 4837 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 
11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798483 4837 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798497 4837 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798506 4837 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798514 4837 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798523 4837 flags.go:64] FLAG: --logging-format="text" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798533 4837 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798543 4837 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798552 4837 flags.go:64] FLAG: --manifest-url="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798561 4837 flags.go:64] FLAG: --manifest-url-header="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798574 4837 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798583 4837 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798594 4837 flags.go:64] FLAG: --max-pods="110" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798603 4837 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798612 4837 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798621 4837 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798655 4837 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798665 4837 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798674 4837 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798684 4837 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798710 4837 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798723 4837 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798735 4837 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798747 4837 flags.go:64] FLAG: --pod-cidr="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798758 4837 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798776 4837 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798787 4837 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798799 4837 flags.go:64] FLAG: --pods-per-core="0" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798810 4837 flags.go:64] FLAG: --port="10250" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798821 4837 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798831 4837 flags.go:64] FLAG: --provider-id="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798843 4837 flags.go:64] FLAG: --qos-reserved="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798856 4837 flags.go:64] FLAG: --read-only-port="10255" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798868 4837 flags.go:64] FLAG: --register-node="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798879 4837 flags.go:64] FLAG: --register-schedulable="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798889 4837 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798905 4837 flags.go:64] FLAG: --registry-burst="10" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798915 4837 flags.go:64] FLAG: --registry-qps="5" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798923 4837 flags.go:64] FLAG: --reserved-cpus="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798932 4837 flags.go:64] FLAG: --reserved-memory="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798944 4837 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798954 4837 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798963 4837 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798973 4837 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798982 4837 flags.go:64] FLAG: --runonce="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.798991 4837 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799000 4837 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799010 4837 flags.go:64] FLAG: --seccomp-default="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799019 4837 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799027 4837 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799037 4837 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799046 4837 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799055 4837 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799064 4837 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799073 4837 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799082 4837 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799090 4837 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799100 4837 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799109 4837 flags.go:64] FLAG: --system-cgroups="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799117 4837 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799131 4837 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799140 4837 flags.go:64] FLAG: --tls-cert-file="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799149 4837 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799160 4837 flags.go:64] FLAG: --tls-min-version="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799169 4837 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799177 4837 flags.go:64] FLAG: --topology-manager-policy="none" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799187 4837 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799204 4837 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799214 4837 flags.go:64] FLAG: --v="2" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799226 4837 flags.go:64] FLAG: --version="false" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799237 4837 flags.go:64] FLAG: --vmodule="" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799248 4837 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.799257 4837 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804180 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804277 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804284 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804293 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804298 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804303 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804315 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804319 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804323 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804330 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804335 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804342 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
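Note on the --system-reserved deprecation above: the kubelet asks for this value to live in the config file named by --config, which the FLAG dump shows as /etc/kubernetes/kubelet.conf, and the flag value logged here is cpu=200m,ephemeral-storage=350Mi,memory=350Mi. That file is not included in this log, so the following is only a hedged sketch of how to check for the equivalent KubeletConfiguration stanza on the node:

    # Sketch only; the real /etc/kubernetes/kubelet.conf is not shown in this log.
    grep -i -A4 systemReserved /etc/kubernetes/kubelet.conf
    # which, if the flag values were mirrored into the config file, would print something like:
    #   systemReserved:
    #     cpu: 200m
    #     ephemeral-storage: 350Mi
    #     memory: 350Mi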
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804351 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804356 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804361 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804368 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804373 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804377 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804385 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804390 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804395 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804399 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804404 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804409 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804412 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804420 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804426 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804432 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804437 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804441 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804448 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804453 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804457 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804464 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804469 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804474 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804479 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804483 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804488 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804492 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804499 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804502 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804507 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804516 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804522 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804527 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804533 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804537 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804543 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804548 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804554 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804560 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804565 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804578 4837 feature_gate.go:330] unrecognized feature 
gate: AlibabaPlatform Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804583 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804592 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804597 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804601 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804606 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804610 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804613 4837 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804685 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804714 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804725 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804737 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804747 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804756 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804781 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804795 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804806 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.804862 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.804877 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.819225 4837 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.819330 4837 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819521 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819553 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819567 4837 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819581 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819594 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819605 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819617 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819628 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819670 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819682 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819693 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819704 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819714 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819727 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819741 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819753 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819763 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819778 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819792 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819804 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819816 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819826 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819839 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819850 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819861 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819872 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819882 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819894 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819905 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819916 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819930 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819941 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819953 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819964 4837 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819974 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819987 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.819997 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820012 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820027 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820039 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820051 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820062 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820074 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820086 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820095 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820105 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820116 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820130 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820142 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820154 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820165 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820176 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820187 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820198 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820210 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820220 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820230 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820240 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820251 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820261 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820271 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820288 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
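Note on the repeated "unrecognized feature gate" warnings: the rejected names (GatewayAPI, PinnedImages, NewOLM, MachineConfigNodes, and so on) appear to be OpenShift cluster-level feature gates, while the kubelet's parser only recognizes upstream Kubernetes gates; a warning here means only that the name is ignored by this binary. The gates it does accept are summarized in the feature_gate.go:386 lines (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy set to true, the rest left false). A hedged way to see which gate names are actually being fed to the kubelet is to inspect the featureGates stanza of the config file from the FLAG dump:

    # Sketch only; assumes the gates are carried in /etc/kubernetes/kubelet.conf, which is not shown here.
    grep -i -A40 featureGates /etc/kubernetes/kubelet.conf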
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820300 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820314 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820325 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820340 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820351 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820367 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820380 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820393 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820405 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.820423 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820791 4837 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820809 4837 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820822 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820834 4837 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820845 4837 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820856 4837 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820866 4837 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820877 4837 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820888 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820898 4837 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820909 4837 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820919 4837 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 
11:48:04.820933 4837 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820947 4837 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820958 4837 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820969 4837 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820980 4837 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.820991 4837 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821001 4837 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821011 4837 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821022 4837 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821033 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821044 4837 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821055 4837 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821066 4837 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821076 4837 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821088 4837 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821099 4837 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821111 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821121 4837 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821132 4837 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821143 4837 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821153 4837 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821165 4837 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821176 4837 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821186 4837 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821196 4837 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821210 4837 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821225 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821238 4837 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821248 4837 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821259 4837 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821270 4837 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821281 4837 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821291 4837 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821301 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821311 4837 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821322 4837 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821335 4837 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821346 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821359 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821369 4837 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821380 4837 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821391 4837 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821402 4837 feature_gate.go:330] unrecognized feature gate: Example Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821419 4837 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821436 4837 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821450 4837 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821462 4837 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821475 4837 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821488 4837 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821499 4837 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821511 4837 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821522 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821534 4837 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821546 4837 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821557 4837 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821568 4837 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821580 4837 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821591 4837 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.821602 4837 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.821620 4837 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.823069 4837 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.832119 4837 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.836412 4837 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.836576 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
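Note on the certificate messages here: the bootstrap.go:266 error reports that the client certificate referenced from /var/lib/kubelet/kubeconfig expired on 2026-02-24, so the kubelet falls back to the bootstrap credentials and loads /var/lib/kubelet/pki/kubelet-client-current.pem; the rotation attempt that follows fails because api-int.crc.testing:6443 refuses connections. A hedged sketch of how one might follow this up on the node once the API server is reachable:

    # Sketch only; the paths and host are taken from the surrounding log lines.
    openssl x509 -noout -subject -enddate -in /var/lib/kubelet/pki/kubelet-client-current.pem
    # Once the API at api-int.crc.testing:6443 starts answering, the kubelet retries its
    # CertificateSigningRequest; any request left pending can be listed and approved with:
    oc get csr
    oc adm certificate approve <csr-name>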
Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.838696 4837 server.go:997] "Starting client certificate rotation" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.838749 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.839051 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.866385 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.869140 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.870573 4837 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.889174 4837 log.go:25] "Validated CRI v1 runtime API" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.919893 4837 log.go:25] "Validated CRI v1 image API" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.925949 4837 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.930907 4837 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-11-43-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.930953 4837 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.946794 4837 manager.go:217] Machine: {Timestamp:2026-03-13 11:48:04.945007037 +0000 UTC m=+0.583273810 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:91a43e7e-d083-4b9e-bcd8-790411e8b2f1 BootID:205607ff-4e76-4a9e-84cc-5670826221a2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 
HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c2:ee:3b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c2:ee:3b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5b:5d:30 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a9:ef:51 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7b:39:fc Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ed:c2:eb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:53:d9:d8:46:ff Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:d1:e0:0f:52:a7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 
Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.947069 4837 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.947222 4837 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.948233 4837 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.948412 4837 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.948462 4837 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.950733 4837 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.950762 4837 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.951420 4837 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 11:48:04 crc 
kubenswrapper[4837]: I0313 11:48:04.951450 4837 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.951715 4837 state_mem.go:36] "Initialized new in-memory state store" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.951814 4837 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.955920 4837 kubelet.go:418] "Attempting to sync node with API server" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.955943 4837 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.955962 4837 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.955976 4837 kubelet.go:324] "Adding apiserver pod source" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.955989 4837 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.961068 4837 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.962172 4837 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.963692 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.963779 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.963701 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.963823 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.964369 4837 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966727 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966790 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966806 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" 
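Annotation: the Node Config entry above records SystemReserved (cpu 200m, memory 350Mi, ephemeral-storage 350Mi), KubeReserved as null, and a hard eviction threshold of 100Mi for memory.available, while the machine inventory reports 33654120448 bytes of memory. Node allocatable memory follows the usual Kubernetes formula, capacity minus system-reserved minus kube-reserved minus the hard eviction threshold; a rough sketch of that arithmetic with the values logged here:

    MI = 1024 * 1024

    capacity_bytes        = 33654120448   # MemoryCapacity reported in the machine info above
    system_reserved_bytes = 350 * MI      # SystemReserved memory: 350Mi
    kube_reserved_bytes   = 0             # KubeReserved is null in this config
    eviction_hard_bytes   = 100 * MI      # memory.available hard eviction threshold: 100Mi

    allocatable = capacity_bytes - system_reserved_bytes - kube_reserved_bytes - eviction_hard_bytes
    print(f"allocatable memory ~ {allocatable} bytes ({allocatable / MI:.0f} Mi)")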
Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966822 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966846 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966864 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966880 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966905 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966924 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.966958 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.967017 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.967033 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.969934 4837 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.970887 4837 server.go:1280] "Started kubelet" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.971185 4837 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.971396 4837 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.972074 4837 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.972868 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:04 crc systemd[1]: Started Kubernetes Kubelet. 
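Annotation: every request to https://api-int.crc.testing:6443 in this window (the CSR creation, the Service/Node/CSIDriver reflectors, the CSINode lookup) fails with "connection refused", consistent with the kube-apiserver static pod not yet being up on this single-node (CRC) cluster; the kubelet keeps retrying on its own. A quick sketch, run from the node, that waits for the endpoint seen in these errors to start accepting TCP connections (host and port are taken from the log, the polling loop itself is illustrative only):

    import socket
    import time

    HOST, PORT = "api-int.crc.testing", 6443   # endpoint from the connection-refused errors

    while True:
        try:
            with socket.create_connection((HOST, PORT), timeout=2):
                print("apiserver is accepting TCP connections")
                break
        except OSError as exc:
            print(f"still unreachable: {exc}")
            time.sleep(5)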
Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.977152 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.977207 4837 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.977629 4837 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.977709 4837 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.977740 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.977957 4837 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.978573 4837 server.go:460] "Adding debug handlers to kubelet server" Mar 13 11:48:04 crc kubenswrapper[4837]: W0313 11:48:04.979054 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.979280 4837 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.979322 4837 factory.go:55] Registering systemd factory Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.979342 4837 factory.go:221] Registration of the systemd container factory successfully Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.979172 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.981798 4837 factory.go:153] Registering CRI-O factory Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.981865 4837 factory.go:221] Registration of the crio container factory successfully Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.981911 4837 factory.go:103] Registering Raw factory Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.981949 4837 manager.go:1196] Started watching for new ooms in manager Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.982838 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 13 11:48:04 crc kubenswrapper[4837]: I0313 11:48:04.984700 4837 manager.go:319] Starting recovery of all containers Mar 13 11:48:04 crc kubenswrapper[4837]: E0313 11:48:04.993033 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c642a087ab8f8 default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,LastTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002164 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002253 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002281 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002305 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002328 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002350 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002375 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002407 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002432 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002453 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002473 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002501 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002525 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002550 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002576 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002673 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002697 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002718 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002739 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002759 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002779 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002802 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002823 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002843 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002865 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002885 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002911 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.002934 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003005 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003028 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003048 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003068 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003090 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003116 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003145 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003174 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003206 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003234 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003307 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003336 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003364 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003391 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003417 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003445 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003469 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003491 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003511 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003531 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003555 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003577 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003599 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003620 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003678 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003703 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003727 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003751 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003773 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003794 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003815 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003836 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003856 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003881 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003902 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003926 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003967 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.003986 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004007 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004025 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004045 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004066 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004087 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004107 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004129 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004149 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004172 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004191 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004211 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004230 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004251 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004270 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004291 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004312 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004334 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004353 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004374 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004393 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004416 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004436 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004456 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004475 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004500 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004520 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004540 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004566 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004586 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004607 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004630 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004676 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004696 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004715 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004735 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004756 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004776 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004796 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004822 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004847 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004868 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004893 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004917 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004940 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004974 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.004997 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005023 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005046 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005067 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005088 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005109 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005130 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005152 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005173 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005192 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005214 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005236 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005283 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005307 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005332 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005353 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005371 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005394 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005414 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005433 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005454 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005475 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005495 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005515 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005535 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.005594 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009601 4837 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009710 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009745 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009770 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009795 4837 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009816 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009846 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009870 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009892 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009913 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009936 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.009972 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010005 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010028 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010054 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010077 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010098 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010120 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010735 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010765 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010794 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010814 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010834 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010913 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010940 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010961 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.010983 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011004 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011037 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011076 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011095 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011134 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011207 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011264 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011296 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011337 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011371 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011403 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011428 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011489 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011518 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011553 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011633 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011689 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011710 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011741 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011763 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011878 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011902 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011946 4837 manager.go:324] Recovery completed Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.011961 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012021 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012049 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012076 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012103 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012133 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012219 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012259 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012292 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012310 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012334 4837 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012352 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012370 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012406 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012552 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012571 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012597 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012618 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012662 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012720 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012749 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012815 4837 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012882 4837 reconstruct.go:97] "Volume reconstruction finished" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.012905 4837 reconciler.go:26] "Reconciler: start to sync state" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.023483 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.027522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.027567 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.027581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.028443 4837 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.028464 4837 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.028494 4837 state_mem.go:36] "Initialized new in-memory state store" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.044568 4837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.046806 4837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.046876 4837 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.046916 4837 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.047078 4837 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.049017 4837 policy_none.go:49] "None policy: Start" Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.049181 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.049311 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.051157 4837 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.051186 4837 state_mem.go:35] "Initializing new in-memory state store" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.078212 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.101515 
4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c642a087ab8f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,LastTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.122997 4837 manager.go:334] "Starting Device Plugin manager" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.123052 4837 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.123070 4837 server.go:79] "Starting device plugin registration server" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.124057 4837 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.124080 4837 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.124218 4837 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.124594 4837 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.124617 4837 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.132177 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.147734 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.147864 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.149227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.149261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.149273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.149420 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.150011 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.150052 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.151546 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.151573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.151586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.151753 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.152215 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.152252 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.152714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.152758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.152804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.153605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.153673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.153691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.153992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.154094 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.154113 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.154523 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.155274 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.155363 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.157914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.157984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158165 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158443 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158753 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.158818 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.159909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.159956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.159977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.160221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.160273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.160282 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.160304 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.160334 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.161480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.161498 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.161508 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.184520 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216428 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216508 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216607 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216632 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216881 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.216900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217396 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.217418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 
crc kubenswrapper[4837]: I0313 11:48:05.224595 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.226139 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.226184 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.226196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.226229 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.226762 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.318908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319424 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319447 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319503 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319558 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319617 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319600 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319567 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319729 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319876 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319894 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319916 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319909 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319959 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.319918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.320178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.427183 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.429220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.429274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.429284 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.429320 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.429846 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.490661 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.500680 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.527119 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.550828 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.552871 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ed78d8e4768e0800e9708f2a71bd11f983ebf3d69efb7ef41280897e00598fd5 WatchSource:0}: Error finding container ed78d8e4768e0800e9708f2a71bd11f983ebf3d69efb7ef41280897e00598fd5: Status 404 returned error can't find the container with id ed78d8e4768e0800e9708f2a71bd11f983ebf3d69efb7ef41280897e00598fd5 Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.553994 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.556142 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-99ca6f098d7c84fa84c2600be42136bf0bbaf5c1a3347060394ff6de274bd3ce WatchSource:0}: Error finding container 99ca6f098d7c84fa84c2600be42136bf0bbaf5c1a3347060394ff6de274bd3ce: Status 404 returned error can't find the container with id 99ca6f098d7c84fa84c2600be42136bf0bbaf5c1a3347060394ff6de274bd3ce Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.566133 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-64312d58cedec4c3773a1f9055b2a8cf2b9e15b3e4d861d57f62d47d6868dc36 WatchSource:0}: Error finding container 64312d58cedec4c3773a1f9055b2a8cf2b9e15b3e4d861d57f62d47d6868dc36: Status 404 returned error can't find the container with id 64312d58cedec4c3773a1f9055b2a8cf2b9e15b3e4d861d57f62d47d6868dc36 Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.570171 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-cecadc8b23beb9519d973317a5e749fc6b4e7f3dd44c6106f0e3896e194deff7 WatchSource:0}: Error finding container cecadc8b23beb9519d973317a5e749fc6b4e7f3dd44c6106f0e3896e194deff7: Status 404 returned error can't find the container with id cecadc8b23beb9519d973317a5e749fc6b4e7f3dd44c6106f0e3896e194deff7 Mar 13 11:48:05 crc kubenswrapper[4837]: W0313 11:48:05.571126 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a7e73c64ad82cebae204f5b72154fd5ff23ca2c12261ae626070480716cc921e WatchSource:0}: Error finding container a7e73c64ad82cebae204f5b72154fd5ff23ca2c12261ae626070480716cc921e: Status 404 returned error can't find the container with id a7e73c64ad82cebae204f5b72154fd5ff23ca2c12261ae626070480716cc921e Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.585803 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.830775 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.832193 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.832246 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.832264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.832297 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: E0313 11:48:05.832930 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 13 11:48:05 crc kubenswrapper[4837]: I0313 11:48:05.973953 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:06 crc kubenswrapper[4837]: W0313 11:48:06.040409 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.040495 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.053285 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cecadc8b23beb9519d973317a5e749fc6b4e7f3dd44c6106f0e3896e194deff7"} Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.054781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64312d58cedec4c3773a1f9055b2a8cf2b9e15b3e4d861d57f62d47d6868dc36"} Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.055834 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99ca6f098d7c84fa84c2600be42136bf0bbaf5c1a3347060394ff6de274bd3ce"} Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.056856 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ed78d8e4768e0800e9708f2a71bd11f983ebf3d69efb7ef41280897e00598fd5"} Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.057903 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a7e73c64ad82cebae204f5b72154fd5ff23ca2c12261ae626070480716cc921e"} Mar 13 11:48:06 crc kubenswrapper[4837]: W0313 11:48:06.330754 4837 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.331202 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:06 crc kubenswrapper[4837]: W0313 11:48:06.366357 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.366491 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.387540 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.633309 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.634957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.635001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.635016 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.635050 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.635371 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 13 11:48:06 crc kubenswrapper[4837]: W0313 11:48:06.646790 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.646872 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" 
Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.920480 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:06 crc kubenswrapper[4837]: E0313 11:48:06.922130 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:06 crc kubenswrapper[4837]: I0313 11:48:06.974082 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.063408 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2" exitCode=0 Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.063500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.063718 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.065600 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.065696 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.065712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.067765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"234dda28dd022687ffca5497d2739026d15c1495269c9dbc78aa23a5344315e0"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.067791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c24c04727e0b72ba159369c3c8d8e99885fd2082fad7452568fda7ef0f31b752"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.067828 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.067842 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f"} Mar 13 11:48:07 crc 
kubenswrapper[4837]: I0313 11:48:07.068001 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.069342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.069387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.069405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.072121 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6" exitCode=0 Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.072236 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.072369 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.073788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.073826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.073838 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.076977 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" exitCode=0 Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.077192 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.077278 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079064 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079142 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079603 4837 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb" exitCode=0 Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb"} Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.079932 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.082786 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.082833 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.082846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.083188 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.085764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.085876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.086057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.487287 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.732778 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:07 crc kubenswrapper[4837]: I0313 11:48:07.974600 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:07 crc kubenswrapper[4837]: E0313 11:48:07.988512 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.085156 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f" exitCode=0 Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.085293 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.085291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.086211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 
11:48:08.086236 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.086247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.089260 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.089319 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.089336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.092407 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.092512 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.093840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.093874 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.093886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.096183 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.096234 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.096301 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.096237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.096368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291"} Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097217 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097520 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.097554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.236983 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.238930 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.238971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.238985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.239067 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:08 crc kubenswrapper[4837]: E0313 11:48:08.239764 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.138:6443: connect: connection refused" node="crc" Mar 13 11:48:08 crc kubenswrapper[4837]: W0313 11:48:08.500863 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:08 crc kubenswrapper[4837]: E0313 11:48:08.501064 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:08 crc kubenswrapper[4837]: W0313 11:48:08.691804 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:08 crc kubenswrapper[4837]: E0313 11:48:08.691954 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:08 crc kubenswrapper[4837]: W0313 11:48:08.903716 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:08 crc kubenswrapper[4837]: E0313 11:48:08.903844 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.138:6443: connect: connection refused" logger="UnhandledError" Mar 13 11:48:08 crc kubenswrapper[4837]: I0313 11:48:08.973763 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.103126 4837 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51" exitCode=0 Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.103257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51"} Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.103350 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.104789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.104842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.104855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.109051 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.109095 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.109110 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.109337 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.109394 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.110158 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"426cdf5295902728def823304a2e4e86538732e81d4f2aa2a575596241730b86"} Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.110303 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17"} Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113527 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113557 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113874 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.113894 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:09 crc kubenswrapper[4837]: I0313 11:48:09.944217 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117168 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7"} Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117248 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9"} Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117268 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952"} Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117286 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1"} Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117342 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.117411 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118874 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.118912 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.220818 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.494101 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.494358 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:10 crc kubenswrapper[4837]: I0313 11:48:10.987959 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.132499 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.132532 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133205 4837 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d"} Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.133949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.430043 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.440227 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.442140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.442205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.442219 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.442259 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:11 crc kubenswrapper[4837]: I0313 11:48:11.501784 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.134894 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.134895 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136311 4837 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:12 crc kubenswrapper[4837]: I0313 11:48:12.136329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.138192 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.138226 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139503 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.139519 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.293849 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.294184 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.295912 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.295955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.295967 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.760357 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.760790 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.762431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.762485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.762499 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:13 crc kubenswrapper[4837]: I0313 11:48:13.767764 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 
13 11:48:14 crc kubenswrapper[4837]: I0313 11:48:14.142421 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:14 crc kubenswrapper[4837]: I0313 11:48:14.144198 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:14 crc kubenswrapper[4837]: I0313 11:48:14.144266 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:14 crc kubenswrapper[4837]: I0313 11:48:14.144290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:15 crc kubenswrapper[4837]: E0313 11:48:15.132303 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:19 crc kubenswrapper[4837]: W0313 11:48:19.437240 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.437365 4837 trace.go:236] Trace[3588364]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 11:48:09.435) (total time: 10001ms): Mar 13 11:48:19 crc kubenswrapper[4837]: Trace[3588364]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:48:19.437) Mar 13 11:48:19 crc kubenswrapper[4837]: Trace[3588364]: [10.001526103s] [10.001526103s] END Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.437398 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 11:48:19 crc kubenswrapper[4837]: W0313 11:48:19.888874 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.888981 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.892273 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:48:19 crc kubenswrapper[4837]: W0313 11:48:19.894430 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.894522 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.896873 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.896934 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 11:48:19 crc kubenswrapper[4837]: W0313 11:48:19.898607 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.898702 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.900775 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.902446 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.902525 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.903584 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c642a087ab8f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,LastTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:19 crc kubenswrapper[4837]: E0313 11:48:19.912310 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.917655 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.952081 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.952268 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.953751 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.953805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.953814 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:19 crc kubenswrapper[4837]: I0313 11:48:19.978156 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:19Z is after 2026-02-23T05:33:13Z Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.162072 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 11:48:20 crc 
kubenswrapper[4837]: I0313 11:48:20.163929 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="426cdf5295902728def823304a2e4e86538732e81d4f2aa2a575596241730b86" exitCode=255 Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.163991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"426cdf5295902728def823304a2e4e86538732e81d4f2aa2a575596241730b86"} Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.164197 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.165195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.165230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.165242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.165806 4837 scope.go:117] "RemoveContainer" containerID="426cdf5295902728def823304a2e4e86538732e81d4f2aa2a575596241730b86" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.455548 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.455901 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.457343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.457386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.457396 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.488787 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.488868 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.512483 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 11:48:20 crc kubenswrapper[4837]: I0313 11:48:20.989360 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:20Z is after 2026-02-23T05:33:13Z Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.168086 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.168490 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.170131 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" exitCode=255 Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.170217 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5"} Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.170280 4837 scope.go:117] "RemoveContainer" containerID="426cdf5295902728def823304a2e4e86538732e81d4f2aa2a575596241730b86" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.170361 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.170394 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.174377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.174426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.174458 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.175825 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.175857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.175894 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.179041 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:21 crc kubenswrapper[4837]: E0313 11:48:21.179234 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.185386 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 11:48:21 
crc kubenswrapper[4837]: I0313 11:48:21.435073 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:21 crc kubenswrapper[4837]: I0313 11:48:21.978025 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:21Z is after 2026-02-23T05:33:13Z Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.174508 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.177067 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.177167 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178174 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178192 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178351 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.178402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.179097 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:22 crc kubenswrapper[4837]: E0313 11:48:22.179302 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.185447 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:22 crc kubenswrapper[4837]: W0313 11:48:22.882910 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:22Z is after 2026-02-23T05:33:13Z Mar 13 11:48:22 crc kubenswrapper[4837]: E0313 11:48:22.882992 4837 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:22 crc kubenswrapper[4837]: I0313 11:48:22.978627 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:22Z is after 2026-02-23T05:33:13Z Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.180161 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.181689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.181725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.181739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.182365 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:23 crc kubenswrapper[4837]: E0313 11:48:23.182545 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:23 crc kubenswrapper[4837]: I0313 11:48:23.979295 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:23Z is after 2026-02-23T05:33:13Z Mar 13 11:48:24 crc kubenswrapper[4837]: I0313 11:48:24.976850 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:24Z is after 2026-02-23T05:33:13Z Mar 13 11:48:25 crc kubenswrapper[4837]: E0313 11:48:25.132452 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.545228 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.545520 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.547708 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.547785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.547805 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.548865 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:25 crc kubenswrapper[4837]: E0313 11:48:25.549246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:25 crc kubenswrapper[4837]: I0313 11:48:25.976244 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:25Z is after 2026-02-23T05:33:13Z Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.292557 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.294259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.294314 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.294333 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.294374 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:26 crc kubenswrapper[4837]: E0313 11:48:26.297836 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:48:26 crc kubenswrapper[4837]: E0313 11:48:26.303600 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.377050 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.377268 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.379627 4837 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.379708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.379725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.380516 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:26 crc kubenswrapper[4837]: E0313 11:48:26.380776 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:26 crc kubenswrapper[4837]: I0313 11:48:26.977139 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:26Z is after 2026-02-23T05:33:13Z Mar 13 11:48:27 crc kubenswrapper[4837]: I0313 11:48:27.978445 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:27Z is after 2026-02-23T05:33:13Z Mar 13 11:48:28 crc kubenswrapper[4837]: I0313 11:48:28.622339 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:28 crc kubenswrapper[4837]: E0313 11:48:28.628340 4837 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:28 crc kubenswrapper[4837]: I0313 11:48:28.978837 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:28Z is after 2026-02-23T05:33:13Z Mar 13 11:48:29 crc kubenswrapper[4837]: E0313 11:48:29.906566 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c642a087ab8f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,LastTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:29 crc kubenswrapper[4837]: I0313 11:48:29.977176 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:29Z is after 2026-02-23T05:33:13Z Mar 13 11:48:30 crc kubenswrapper[4837]: W0313 11:48:30.128243 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:30Z is after 2026-02-23T05:33:13Z Mar 13 11:48:30 crc kubenswrapper[4837]: E0313 11:48:30.128339 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.488941 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.489046 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.489116 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.489280 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.490576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.490622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.490650 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 
11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.491173 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.491329 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066" gracePeriod=30 Mar 13 11:48:30 crc kubenswrapper[4837]: W0313 11:48:30.912685 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:30Z is after 2026-02-23T05:33:13Z Mar 13 11:48:30 crc kubenswrapper[4837]: E0313 11:48:30.913072 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:30 crc kubenswrapper[4837]: I0313 11:48:30.980301 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:30Z is after 2026-02-23T05:33:13Z Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.210273 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.210758 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066" exitCode=255 Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.210807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066"} Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.210852 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8"} Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.210989 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.212403 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.212447 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.212463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:31 crc kubenswrapper[4837]: I0313 11:48:31.979583 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:31Z is after 2026-02-23T05:33:13Z Mar 13 11:48:32 crc kubenswrapper[4837]: W0313 11:48:32.567007 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:32Z is after 2026-02-23T05:33:13Z Mar 13 11:48:32 crc kubenswrapper[4837]: E0313 11:48:32.567125 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:32 crc kubenswrapper[4837]: I0313 11:48:32.979255 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:32Z is after 2026-02-23T05:33:13Z Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.299069 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.301393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.301480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.301506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.301557 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:33 crc kubenswrapper[4837]: E0313 11:48:33.307389 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 11:48:33 crc kubenswrapper[4837]: E0313 11:48:33.311012 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 11:48:33 crc kubenswrapper[4837]: I0313 11:48:33.978244 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:33Z is after 2026-02-23T05:33:13Z Mar 13 11:48:34 crc kubenswrapper[4837]: W0313 11:48:34.396880 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:34Z is after 2026-02-23T05:33:13Z Mar 13 11:48:34 crc kubenswrapper[4837]: E0313 11:48:34.396990 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 11:48:34 crc kubenswrapper[4837]: I0313 11:48:34.979758 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:34Z is after 2026-02-23T05:33:13Z Mar 13 11:48:35 crc kubenswrapper[4837]: E0313 11:48:35.132706 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:35 crc kubenswrapper[4837]: I0313 11:48:35.978836 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:35Z is after 2026-02-23T05:33:13Z Mar 13 11:48:36 crc kubenswrapper[4837]: I0313 11:48:36.977227 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:48:36Z is after 2026-02-23T05:33:13Z Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.488456 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.488715 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.490179 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:37 crc 
kubenswrapper[4837]: I0313 11:48:37.490218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.490233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.732876 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:48:37 crc kubenswrapper[4837]: I0313 11:48:37.981318 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:38 crc kubenswrapper[4837]: I0313 11:48:38.230548 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:38 crc kubenswrapper[4837]: I0313 11:48:38.231573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:38 crc kubenswrapper[4837]: I0313 11:48:38.231624 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:38 crc kubenswrapper[4837]: I0313 11:48:38.231669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:38 crc kubenswrapper[4837]: I0313 11:48:38.981324 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.047857 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.049598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.049685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.049708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.050572 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.950470 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a087ab8f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,LastTimestamp:2026-03-13 11:48:04.970797304 +0000 UTC m=+0.609064107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc 
kubenswrapper[4837]: E0313 11:48:39.955541 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.961256 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.966578 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.971343 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a11ceac01 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.127293953 +0000 UTC m=+0.765560716,LastTimestamp:2026-03-13 11:48:05.127293953 +0000 UTC m=+0.765560716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
13 11:48:39 crc kubenswrapper[4837]: I0313 11:48:39.977561 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.977547 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.149249978 +0000 UTC m=+0.787516751,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.979694 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.149268488 +0000 UTC m=+0.787535261,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.985188 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.149281119 +0000 UTC m=+0.787547892,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.990236 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.151566828 +0000 UTC m=+0.789833601,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.994084 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.151580709 +0000 UTC m=+0.789847482,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.996613 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.151593439 +0000 UTC m=+0.789860212,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:39 crc kubenswrapper[4837]: E0313 11:48:39.999467 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.152752869 +0000 UTC m=+0.791019642,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.002760 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.152766249 +0000 UTC m=+0.791033022,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.004564 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.152812301 +0000 UTC m=+0.791079074,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.006854 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.153628949 +0000 UTC m=+0.791895722,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.009386 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.153685031 +0000 UTC m=+0.791951804,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.011501 4837 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.153699881 +0000 UTC m=+0.791966654,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.013316 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.154028383 +0000 UTC m=+0.792295166,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.015559 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.154105745 +0000 UTC m=+0.792372518,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.016880 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.154122106 +0000 UTC m=+0.792388879,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc 
kubenswrapper[4837]: E0313 11:48:40.020263 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.157956937 +0000 UTC m=+0.796223740,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.024444 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.157998819 +0000 UTC m=+0.796265612,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.028084 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd42ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd42ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027586766 +0000 UTC m=+0.665853529,LastTimestamp:2026-03-13 11:48:05.15802177 +0000 UTC m=+0.796288563,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.032188 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdcd2e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdcd2e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027558115 +0000 UTC m=+0.665824878,LastTimestamp:2026-03-13 11:48:05.158154964 +0000 UTC m=+0.796421747,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.036529 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c642a0bdd0cfa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c642a0bdd0cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.027572986 +0000 UTC m=+0.665839749,LastTimestamp:2026-03-13 11:48:05.158180405 +0000 UTC m=+0.796447178,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.043498 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a2bc8c7ca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.563115466 +0000 UTC m=+1.201382259,LastTimestamp:2026-03-13 11:48:05.563115466 +0000 UTC m=+1.201382259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.047915 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a2bee3979 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.565569401 +0000 UTC m=+1.203836164,LastTimestamp:2026-03-13 11:48:05.565569401 +0000 UTC m=+1.203836164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.052499 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a2c3454ad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.570163885 +0000 UTC m=+1.208430658,LastTimestamp:2026-03-13 11:48:05.570163885 +0000 UTC m=+1.208430658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.057439 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a2c7a1c16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.574736918 +0000 UTC m=+1.213003681,LastTimestamp:2026-03-13 11:48:05.574736918 +0000 UTC m=+1.213003681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.060765 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a2c9fc638 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:05.577205304 +0000 UTC m=+1.215472067,LastTimestamp:2026-03-13 11:48:05.577205304 +0000 UTC m=+1.215472067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.065691 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a52a2d928 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.214940968 +0000 UTC m=+1.853207731,LastTimestamp:2026-03-13 11:48:06.214940968 +0000 UTC m=+1.853207731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.069933 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a52a54eb4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.215102132 +0000 UTC m=+1.853368895,LastTimestamp:2026-03-13 11:48:06.215102132 +0000 UTC m=+1.853368895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.073793 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a52bab22d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.216503853 +0000 UTC m=+1.854770616,LastTimestamp:2026-03-13 11:48:06.216503853 +0000 UTC m=+1.854770616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.077304 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a52bc0f35 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.216593205 +0000 UTC m=+1.854860008,LastTimestamp:2026-03-13 11:48:06.216593205 +0000 UTC m=+1.854860008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.081554 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a536e16cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.228260559 +0000 UTC m=+1.866527322,LastTimestamp:2026-03-13 11:48:06.228260559 +0000 UTC m=+1.866527322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.086145 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a53ac62e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.232343272 +0000 UTC m=+1.870610035,LastTimestamp:2026-03-13 11:48:06.232343272 +0000 UTC m=+1.870610035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.090352 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a53b15e3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.232669758 +0000 UTC m=+1.870936521,LastTimestamp:2026-03-13 11:48:06.232669758 +0000 UTC m=+1.870936521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.094168 4837 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a53b2fe8b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.232776331 +0000 UTC m=+1.871043134,LastTimestamp:2026-03-13 11:48:06.232776331 +0000 UTC m=+1.871043134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.098050 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a53c205f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.233761273 +0000 UTC m=+1.872028076,LastTimestamp:2026-03-13 11:48:06.233761273 +0000 UTC m=+1.872028076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.102347 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a53d77337 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.235165495 +0000 UTC m=+1.873432248,LastTimestamp:2026-03-13 11:48:06.235165495 +0000 UTC m=+1.873432248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.106864 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a5407d9ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.238337466 +0000 UTC m=+1.876604229,LastTimestamp:2026-03-13 11:48:06.238337466 +0000 UTC m=+1.876604229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.111047 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a68d15582 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.58708621 +0000 UTC m=+2.225352973,LastTimestamp:2026-03-13 11:48:06.58708621 +0000 UTC m=+2.225352973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.115200 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a699105fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.599648763 +0000 UTC m=+2.237915526,LastTimestamp:2026-03-13 11:48:06.599648763 +0000 UTC m=+2.237915526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.119163 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a69a2aae1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.600805089 +0000 UTC m=+2.239071842,LastTimestamp:2026-03-13 11:48:06.600805089 +0000 UTC m=+2.239071842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.124401 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a755c58f2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.797523186 +0000 UTC m=+2.435789949,LastTimestamp:2026-03-13 11:48:06.797523186 +0000 UTC m=+2.435789949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.128632 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a762985c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.81096954 +0000 UTC m=+2.449236303,LastTimestamp:2026-03-13 11:48:06.81096954 +0000 UTC m=+2.449236303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.133758 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a7649ca11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.813084177 +0000 UTC m=+2.451350940,LastTimestamp:2026-03-13 11:48:06.813084177 +0000 UTC m=+2.451350940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.139027 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a82a17e86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.020158598 +0000 UTC m=+2.658425371,LastTimestamp:2026-03-13 11:48:07.020158598 +0000 UTC m=+2.658425371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.144279 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a8369ea2a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.033293354 +0000 UTC m=+2.671560127,LastTimestamp:2026-03-13 11:48:07.033293354 +0000 UTC m=+2.671560127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.149391 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a85767727 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.067670311 +0000 UTC m=+2.705937114,LastTimestamp:2026-03-13 11:48:07.067670311 +0000 UTC m=+2.705937114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.159251 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a864b3ea9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.081615017 +0000 UTC m=+2.719881790,LastTimestamp:2026-03-13 11:48:07.081615017 +0000 UTC m=+2.719881790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.167777 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a865c29c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.082723782 +0000 UTC m=+2.720990555,LastTimestamp:2026-03-13 11:48:07.082723782 +0000 UTC m=+2.720990555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.174075 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a86826e85 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.085231749 
+0000 UTC m=+2.723498522,LastTimestamp:2026-03-13 11:48:07.085231749 +0000 UTC m=+2.723498522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.178983 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a93f2775a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.31067785 +0000 UTC m=+2.948944613,LastTimestamp:2026-03-13 11:48:07.31067785 +0000 UTC m=+2.948944613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.182941 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a94430eba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.315959482 +0000 UTC m=+2.954226245,LastTimestamp:2026-03-13 11:48:07.315959482 +0000 UTC m=+2.954226245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.187089 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a94433020 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.315968032 +0000 UTC m=+2.954234795,LastTimestamp:2026-03-13 11:48:07.315968032 +0000 UTC m=+2.954234795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.191915 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a944c26d0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.316555472 +0000 UTC m=+2.954822235,LastTimestamp:2026-03-13 11:48:07.316555472 +0000 UTC m=+2.954822235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.196612 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a94e7e2be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.326761662 +0000 UTC m=+2.965028425,LastTimestamp:2026-03-13 11:48:07.326761662 +0000 UTC m=+2.965028425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.202202 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642a94f940e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.327899879 +0000 UTC m=+2.966166662,LastTimestamp:2026-03-13 11:48:07.327899879 +0000 UTC m=+2.966166662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.206118 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a952981a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.331062179 +0000 UTC m=+2.969328942,LastTimestamp:2026-03-13 11:48:07.331062179 +0000 UTC m=+2.969328942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.210697 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c642a9567de2f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.335149103 +0000 UTC m=+2.973415856,LastTimestamp:2026-03-13 11:48:07.335149103 +0000 UTC m=+2.973415856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.215069 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642a9596afb9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.338217401 +0000 UTC m=+2.976484164,LastTimestamp:2026-03-13 11:48:07.338217401 +0000 UTC m=+2.976484164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.218844 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642a9598c4bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.338353853 +0000 UTC m=+2.976620616,LastTimestamp:2026-03-13 11:48:07.338353853 +0000 UTC m=+2.976620616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.222324 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642aa12170b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.53186015 +0000 UTC m=+3.170126913,LastTimestamp:2026-03-13 11:48:07.53186015 +0000 UTC m=+3.170126913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.227082 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642aa1400547 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.533864263 +0000 UTC m=+3.172131016,LastTimestamp:2026-03-13 11:48:07.533864263 +0000 UTC m=+3.172131016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.231776 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642aa2b84e2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.558524458 +0000 UTC m=+3.196791221,LastTimestamp:2026-03-13 11:48:07.558524458 +0000 UTC m=+3.196791221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 
11:48:40.236084 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642aa2cd30cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.559893199 +0000 UTC m=+3.198159962,LastTimestamp:2026-03-13 11:48:07.559893199 +0000 UTC m=+3.198159962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.237977 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.238724 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.240742 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" exitCode=255 Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.240872 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a"} Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.240980 4837 scope.go:117] "RemoveContainer" containerID="d48dfb6261690badcc8bdd6d6dc0a6e2c8bc051c0ff9240fe1cfd02c1c56daa5" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.241363 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.242044 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642aa2f85b9b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.562722203 +0000 UTC m=+3.200988966,LastTimestamp:2026-03-13 11:48:07.562722203 +0000 UTC m=+3.200988966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.242881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.242917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.242926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.243707 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.243883 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.251425 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642aa31674bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:07.564694715 +0000 UTC m=+3.202961478,LastTimestamp:2026-03-13 11:48:07.564694715 +0000 UTC m=+3.202961478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.256929 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ac03c2ee2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.053706466 +0000 UTC m=+3.691973229,LastTimestamp:2026-03-13 11:48:08.053706466 +0000 UTC m=+3.691973229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: 
E0313 11:48:40.261223 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642ac07c1be7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.057895911 +0000 UTC m=+3.696162684,LastTimestamp:2026-03-13 11:48:08.057895911 +0000 UTC m=+3.696162684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.268515 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ac21040c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.084381896 +0000 UTC m=+3.722648659,LastTimestamp:2026-03-13 11:48:08.084381896 +0000 UTC m=+3.722648659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.274511 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ac22f1b11 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.086403857 +0000 UTC m=+3.724670660,LastTimestamp:2026-03-13 11:48:08.086403857 +0000 UTC m=+3.724670660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.279365 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c642ac242e6a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.087701158 +0000 UTC m=+3.725967931,LastTimestamp:2026-03-13 11:48:08.087701158 +0000 UTC m=+3.725967931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.284282 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c642ac24b02da openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.088232666 +0000 UTC m=+3.726499469,LastTimestamp:2026-03-13 11:48:08.088232666 +0000 UTC m=+3.726499469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.290491 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ad1fb0f21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.351428385 +0000 UTC m=+3.989695148,LastTimestamp:2026-03-13 11:48:08.351428385 +0000 UTC m=+3.989695148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.295378 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642ad202aa3a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.351926842 +0000 UTC m=+3.990193605,LastTimestamp:2026-03-13 11:48:08.351926842 +0000 UTC m=+3.990193605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.299240 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ad2db779a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.366135194 +0000 UTC m=+4.004401957,LastTimestamp:2026-03-13 11:48:08.366135194 +0000 UTC m=+4.004401957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.303261 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ad2f0f591 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.367543697 +0000 UTC m=+4.005810460,LastTimestamp:2026-03-13 11:48:08.367543697 +0000 UTC m=+4.005810460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.307763 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.307931 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642ad31e950b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.370533643 +0000 UTC m=+4.008800416,LastTimestamp:2026-03-13 11:48:08.370533643 +0000 UTC m=+4.008800416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.309085 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.309139 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.309152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.309185 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.312042 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.312391 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adec52071 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.565997681 +0000 UTC m=+4.204264484,LastTimestamp:2026-03-13 11:48:08.565997681 +0000 UTC m=+4.204264484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.312776 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.314161 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adfa53139 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.580682041 +0000 UTC m=+4.218948794,LastTimestamp:2026-03-13 11:48:08.580682041 +0000 UTC m=+4.218948794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.316101 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642aff001483 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.106732163 +0000 UTC m=+4.744998926,LastTimestamp:2026-03-13 11:48:09.106732163 +0000 UTC m=+4.744998926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.321231 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b0c1423af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.326150575 +0000 UTC m=+4.964417348,LastTimestamp:2026-03-13 11:48:09.326150575 +0000 UTC m=+4.964417348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.327188 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b0c9e3c59 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.335200857 +0000 UTC m=+4.973467660,LastTimestamp:2026-03-13 11:48:09.335200857 +0000 UTC m=+4.973467660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.331306 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b0cb1492f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.336449327 +0000 UTC m=+4.974716090,LastTimestamp:2026-03-13 11:48:09.336449327 +0000 UTC m=+4.974716090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.335274 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b19c4fe83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.555844739 +0000 UTC m=+5.194111552,LastTimestamp:2026-03-13 11:48:09.555844739 +0000 UTC m=+5.194111552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.339446 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b1a8b5d93 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.568845203 +0000 UTC m=+5.207112006,LastTimestamp:2026-03-13 11:48:09.568845203 +0000 UTC m=+5.207112006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.344051 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b1aaa5b9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.570876315 +0000 UTC m=+5.209143118,LastTimestamp:2026-03-13 11:48:09.570876315 +0000 UTC m=+5.209143118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.348225 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b2808fc6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.795181675 +0000 UTC m=+5.433448448,LastTimestamp:2026-03-13 11:48:09.795181675 +0000 UTC m=+5.433448448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.351935 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b28a954b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.805690039 +0000 UTC m=+5.443956812,LastTimestamp:2026-03-13 11:48:09.805690039 +0000 UTC m=+5.443956812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.355810 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b28c03aeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:09.807190763 +0000 UTC m=+5.445457526,LastTimestamp:2026-03-13 
11:48:09.807190763 +0000 UTC m=+5.445457526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.359891 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b34e9bf5b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.011238235 +0000 UTC m=+5.649504998,LastTimestamp:2026-03-13 11:48:10.011238235 +0000 UTC m=+5.649504998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.363520 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b35876ea1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.021572257 +0000 UTC m=+5.659839060,LastTimestamp:2026-03-13 11:48:10.021572257 +0000 UTC m=+5.659839060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.367804 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b35a01488 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.023187592 +0000 UTC m=+5.661454355,LastTimestamp:2026-03-13 11:48:10.023187592 +0000 UTC m=+5.661454355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.372780 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b405a22f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.203153137 +0000 UTC m=+5.841419900,LastTimestamp:2026-03-13 11:48:10.203153137 +0000 UTC m=+5.841419900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.378342 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c642b4105af05 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.214395653 +0000 UTC m=+5.852662416,LastTimestamp:2026-03-13 11:48:10.214395653 +0000 UTC m=+5.852662416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.384772 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642b51b490f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:40 crc kubenswrapper[4837]: body: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,LastTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.389365 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642b51b70040 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494451776 +0000 UTC m=+6.132718579,LastTimestamp:2026-03-13 11:48:10.494451776 +0000 UTC m=+6.132718579,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.396337 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-apiserver-crc.189c642d82253ae4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 11:48:40 crc kubenswrapper[4837]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:48:40 crc kubenswrapper[4837]: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,LastTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.402550 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642d8225f364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,LastTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.406505 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642d82253ae4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-apiserver-crc.189c642d82253ae4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 11:48:40 crc kubenswrapper[4837]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 11:48:40 crc kubenswrapper[4837]: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.896916708 +0000 UTC m=+15.535183491,LastTimestamp:2026-03-13 11:48:19.902502782 +0000 UTC m=+15.540769545,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.410940 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642d8225f364\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642d8225f364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:19.89696394 +0000 UTC m=+15.535230713,LastTimestamp:2026-03-13 11:48:19.902559993 +0000 UTC m=+15.540826766,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.415551 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642ad2f0f591\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642ad2f0f591 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.367543697 +0000 UTC m=+4.005810460,LastTimestamp:2026-03-13 11:48:20.166859084 +0000 UTC m=+15.805125857,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.420690 4837 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642adec52071\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adec52071 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.565997681 +0000 UTC m=+4.204264484,LastTimestamp:2026-03-13 11:48:20.361068104 +0000 UTC m=+15.999334867,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.424807 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c642adfa53139\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c642adfa53139 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:08.580682041 +0000 UTC m=+4.218948794,LastTimestamp:2026-03-13 11:48:20.36997771 +0000 UTC m=+16.008244473,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.431790 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:40 crc kubenswrapper[4837]: body: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,LastTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.436683 
4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.445106 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56d5244\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:40 crc kubenswrapper[4837]: body: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC m=+16.127110607,LastTimestamp:2026-03-13 11:48:30.489017877 +0000 UTC m=+26.127284640,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.452094 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56e187a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:30.489077039 +0000 UTC m=+26.127343802,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.457582 4837 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642ff99eed20 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:30.491315488 +0000 UTC m=+26.129582251,LastTimestamp:2026-03-13 11:48:30.491315488 +0000 UTC m=+26.129582251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.462624 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a53d77337\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a53d77337 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.235165495 +0000 UTC m=+1.873432248,LastTimestamp:2026-03-13 11:48:30.626406165 +0000 UTC m=+26.264672928,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.469373 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a68d15582\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a68d15582 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.58708621 +0000 UTC m=+2.225352973,LastTimestamp:2026-03-13 11:48:30.831602585 +0000 UTC m=+26.469869348,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.474618 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642a699105fb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642a699105fb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:06.599648763 +0000 UTC m=+2.237915526,LastTimestamp:2026-03-13 11:48:30.843791993 +0000 UTC m=+26.482058766,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.489028 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.489123 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.494892 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56d5244\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:40 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642da56d5244 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:40 crc kubenswrapper[4837]: body: Mar 13 11:48:40 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488843844 +0000 UTC 
m=+16.127110607,LastTimestamp:2026-03-13 11:48:40.489099458 +0000 UTC m=+36.127366231,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:40 crc kubenswrapper[4837]: > Mar 13 11:48:40 crc kubenswrapper[4837]: E0313 11:48:40.499619 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642da56e187a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c642da56e187a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:20.488894586 +0000 UTC m=+16.127161339,LastTimestamp:2026-03-13 11:48:40.48915876 +0000 UTC m=+36.127425533,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:48:40 crc kubenswrapper[4837]: I0313 11:48:40.982127 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:41 crc kubenswrapper[4837]: I0313 11:48:41.248938 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:48:41 crc kubenswrapper[4837]: I0313 11:48:41.981022 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:42 crc kubenswrapper[4837]: I0313 11:48:42.980784 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:43 crc kubenswrapper[4837]: W0313 11:48:43.978894 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:43 crc kubenswrapper[4837]: E0313 11:48:43.979211 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:43 crc kubenswrapper[4837]: I0313 11:48:43.979359 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:44 crc kubenswrapper[4837]: W0313 11:48:44.701753 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 11:48:44 crc kubenswrapper[4837]: E0313 11:48:44.702321 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:44 crc kubenswrapper[4837]: I0313 11:48:44.978629 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:45 crc kubenswrapper[4837]: E0313 11:48:45.133130 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.544911 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.545208 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.546923 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.548023 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:48:45 crc kubenswrapper[4837]: E0313 11:48:45.548699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.845164 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.866691 4837 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 11:48:45 crc kubenswrapper[4837]: I0313 11:48:45.980494 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.377613 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.377882 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.380857 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.382164 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:48:46 crc kubenswrapper[4837]: E0313 11:48:46.382526 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:46 crc kubenswrapper[4837]: I0313 11:48:46.978799 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.312989 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.315622 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:47 crc kubenswrapper[4837]: E0313 11:48:47.320882 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:48:47 crc kubenswrapper[4837]: E0313 11:48:47.320993 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:47 crc kubenswrapper[4837]: I0313 11:48:47.981343 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:48 crc kubenswrapper[4837]: W0313 11:48:48.114537 4837 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 13 11:48:48 crc kubenswrapper[4837]: E0313 11:48:48.114623 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:48 crc kubenswrapper[4837]: I0313 11:48:48.980458 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:49.978054 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.488507 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.488623 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 11:48:50 crc kubenswrapper[4837]: E0313 11:48:50.498278 4837 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c642b51b490f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 11:48:50 crc kubenswrapper[4837]: &Event{ObjectMeta:{kube-controller-manager-crc.189c642b51b490f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 11:48:50 crc kubenswrapper[4837]: body: Mar 13 11:48:50 crc kubenswrapper[4837]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:48:10.494292214 +0000 UTC m=+6.132559017,LastTimestamp:2026-03-13 11:48:50.48858393 +0000 UTC m=+46.126850723,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 11:48:50 crc kubenswrapper[4837]: > Mar 13 11:48:50 crc kubenswrapper[4837]: I0313 11:48:50.982200 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:51 crc kubenswrapper[4837]: I0313 11:48:51.981122 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:52 crc kubenswrapper[4837]: W0313 11:48:52.425492 4837 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 11:48:52 crc kubenswrapper[4837]: E0313 11:48:52.425566 4837 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 11:48:52 crc kubenswrapper[4837]: I0313 11:48:52.983114 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.299590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.299797 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.301297 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:53 crc kubenswrapper[4837]: I0313 11:48:53.977427 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.322085 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.323957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324080 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.324138 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:48:54 crc kubenswrapper[4837]: E0313 11:48:54.329994 4837 controller.go:145] "Failed to ensure lease exists, 
will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:48:54 crc kubenswrapper[4837]: E0313 11:48:54.329838 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:48:54 crc kubenswrapper[4837]: I0313 11:48:54.981114 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:55 crc kubenswrapper[4837]: E0313 11:48:55.133355 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:48:55 crc kubenswrapper[4837]: I0313 11:48:55.980193 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:56 crc kubenswrapper[4837]: I0313 11:48:56.979511 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:57 crc kubenswrapper[4837]: I0313 11:48:57.981026 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:58 crc kubenswrapper[4837]: I0313 11:48:58.982121 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.047945 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.049993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.050096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.050115 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:48:59 crc kubenswrapper[4837]: I0313 11:48:59.051070 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:48:59 crc kubenswrapper[4837]: E0313 11:48:59.051281 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:48:59 crc 
kubenswrapper[4837]: I0313 11:48:59.983410 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.488900 4837 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.489038 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.489150 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.489491 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.491384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.492448 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.492664 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8" gracePeriod=30 Mar 13 11:49:00 crc kubenswrapper[4837]: I0313 11:49:00.980113 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.321074 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.323508 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324053 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8" exitCode=255 Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324111 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6f6e6211c6e06af773a58005617cdc56edcb5787a72302dacb1aa7602572beb8"} Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324192 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0cb44c62a16dac6c4ffe8a78228279de3d95df063c3450e21ba1bd7d3d27f29"} Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324230 4837 scope.go:117] "RemoveContainer" containerID="f78690d91eabf6f2c116b2e2bea9989a42acaeeef513ed5a6050a251c3d03066" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.324325 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.325347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.330906 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.332924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333104 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.333151 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:49:01 crc kubenswrapper[4837]: E0313 11:49:01.337311 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:49:01 crc kubenswrapper[4837]: E0313 11:49:01.338294 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:49:01 crc kubenswrapper[4837]: I0313 11:49:01.979287 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.328496 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.329840 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.330900 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:02 crc kubenswrapper[4837]: I0313 11:49:02.979282 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:03 crc kubenswrapper[4837]: I0313 11:49:03.978178 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:04 crc kubenswrapper[4837]: I0313 11:49:04.978596 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:05 crc kubenswrapper[4837]: E0313 11:49:05.133765 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:49:05 crc kubenswrapper[4837]: I0313 11:49:05.977471 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:06 crc kubenswrapper[4837]: I0313 11:49:06.978074 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.487289 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.487786 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489205 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.489235 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.492937 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.733200 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:07 crc kubenswrapper[4837]: I0313 11:49:07.978383 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.338403 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.339787 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:49:08 crc kubenswrapper[4837]: E0313 11:49:08.344886 4837 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.345901 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.346849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:08 crc kubenswrapper[4837]: E0313 11:49:08.348982 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 11:49:08 crc kubenswrapper[4837]: I0313 11:49:08.979697 4837 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.348862 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350175 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.350189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.432170 4837 csr.go:261] certificate signing request csr-f4sll is approved, waiting to be issued Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.441846 4837 csr.go:257] certificate signing request csr-f4sll is issued Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.536974 4837 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 11:49:09 crc kubenswrapper[4837]: I0313 11:49:09.839180 4837 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.048249 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.049626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.050272 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.354340 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.356997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"} Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.357167 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358197 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.358207 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.443359 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 11:12:52.35671313 +0000 UTC Mar 13 11:49:10 crc kubenswrapper[4837]: I0313 11:49:10.443425 4837 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7079h23m41.913290594s for next certificate rotation Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 
11:49:11.047276 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050484 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.050580 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.349051 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.360568 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.360941 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362694 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" exitCode=255 Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8"} Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362809 4837 scope.go:117] "RemoveContainer" containerID="567274bd739faf34091c836a1d1ba1184d2ed741ed419f592fc7dbec60b92e8a" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.362996 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.363995 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:11 crc kubenswrapper[4837]: I0313 11:49:11.364585 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:11 crc kubenswrapper[4837]: E0313 11:49:11.364777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:12 crc kubenswrapper[4837]: I0313 11:49:12.367730 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.134877 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.345359 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.346866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.347009 4837 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.355772 4837 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.356106 4837 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.356132 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.360814 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.376357 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382703 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.382744 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.399792 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404808 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404821 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.404849 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.414211 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418586 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.418627 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:15Z","lastTransitionTime":"2026-03-13T11:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429669 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429893 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.429944 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.530442 4837 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.544668 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.544893 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.546266 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:15 crc kubenswrapper[4837]: I0313 11:49:15.547068 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.547242 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.630954 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.731805 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.832419 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:15 crc kubenswrapper[4837]: E0313 11:49:15.933272 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.034284 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.134470 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.235657 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.335788 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.377590 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.380054 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.381264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 
11:49:16.381299 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.381312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:16 crc kubenswrapper[4837]: I0313 11:49:16.382251 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.382572 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.435902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.536547 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.636753 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.737794 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.838348 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:16 crc kubenswrapper[4837]: E0313 11:49:16.939314 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.040191 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.141184 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.242230 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.343344 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.443806 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.545090 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.646273 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.737499 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.737796 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:17 crc 
kubenswrapper[4837]: I0313 11:49:17.739448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.739496 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:17 crc kubenswrapper[4837]: I0313 11:49:17.739506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.746691 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.847833 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:17 crc kubenswrapper[4837]: E0313 11:49:17.948026 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.048953 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.149432 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.249832 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.350659 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.451827 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.552268 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.652490 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.753587 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.853902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:18 crc kubenswrapper[4837]: E0313 11:49:18.954020 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.054358 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.155528 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.256136 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.357082 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.457307 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.557511 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.658059 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.758885 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.859673 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:19 crc kubenswrapper[4837]: E0313 11:49:19.959800 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.060564 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.160909 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.261263 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.362457 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.462923 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.563297 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.664305 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.764877 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.865062 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:20 crc kubenswrapper[4837]: E0313 11:49:20.965697 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.066711 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.167473 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.267832 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.368128 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.468725 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.569049 4837 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.669412 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.770055 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.870571 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:21 crc kubenswrapper[4837]: E0313 11:49:21.971138 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.072405 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.172773 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.273259 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.374013 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.475061 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.590057 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.690410 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.791223 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.892120 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:22 crc kubenswrapper[4837]: E0313 11:49:22.992690 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.093539 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.194467 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.294568 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.395446 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.496285 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.597112 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.697455 4837 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.797901 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:23 crc kubenswrapper[4837]: E0313 11:49:23.898770 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:23.999971 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.100902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: I0313 11:49:24.161090 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.201898 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.302484 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.403849 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.504363 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.605100 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.705393 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.806329 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:24 crc kubenswrapper[4837]: E0313 11:49:24.907327 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.007465 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.108044 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.135359 4837 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.208540 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.309322 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.410101 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.511206 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.561601 4837 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566833 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566858 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.566879 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.585482 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590594 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590606 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.590647 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.605149 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610464 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610497 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610509 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.610538 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.622576 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627667 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627687 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:25 crc kubenswrapper[4837]: I0313 11:49:25.627699 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:25Z","lastTransitionTime":"2026-03-13T11:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639427 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639597 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.639628 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.739821 4837 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.840902 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:25 crc kubenswrapper[4837]: E0313 11:49:25.941138 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.042199 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.142966 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.243280 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.344522 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.445387 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.546560 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.647528 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.748373 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.849106 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:26 crc kubenswrapper[4837]: E0313 11:49:26.950572 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.047896 4837 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049213 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.049231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:27 crc kubenswrapper[4837]: I0313 11:49:27.050439 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.050792 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.050792 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.151998 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.252267 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.352498 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.453734 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.558600 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.658733 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.759188 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.859958 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:27 crc kubenswrapper[4837]: E0313 11:49:27.960172 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.060463 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.161227 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.261427 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.361978 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.462574 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.563708 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.663976 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.764787 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.864928 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:28 crc kubenswrapper[4837]: E0313 11:49:28.965447 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.065720 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.166934 
4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.267451 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.368154 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.468693 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.569115 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.669544 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.770609 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.871367 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:29 crc kubenswrapper[4837]: E0313 11:49:29.972008 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.072327 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.172454 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.273025 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.373252 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.473799 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.574398 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.675145 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.776162 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.876712 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:30 crc kubenswrapper[4837]: E0313 11:49:30.977099 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.077491 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.178578 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 
11:49:31.279239 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.379729 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.480347 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.581541 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.682371 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.782558 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.883702 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:31 crc kubenswrapper[4837]: E0313 11:49:31.984541 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.084942 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.186415 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.286747 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.387325 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.488301 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.589523 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.690290 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.791204 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.891772 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:32 crc kubenswrapper[4837]: E0313 11:49:32.992655 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.093242 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.194197 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.294760 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc 
kubenswrapper[4837]: E0313 11:49:33.395274 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.495901 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.596560 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.697457 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.797849 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.898996 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:33 crc kubenswrapper[4837]: E0313 11:49:33.999151 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.099728 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.200866 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.302151 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.403504 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.504575 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.605178 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.705511 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.806665 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:34 crc kubenswrapper[4837]: E0313 11:49:34.907189 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.008055 4837 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.013048 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.088424 4837 apiserver.go:52] "Watching apiserver" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.095472 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.096288 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/network-metrics-daemon-cjn4q","openshift-ovn-kubernetes/ovnkube-node-4zzrs","openshift-image-registry/node-ca-np68d","openshift-machine-config-operator/machine-config-daemon-2td4d","openshift-dns/node-resolver-xwmn9","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl","openshift-multus/multus-qg957","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-multus/multus-additional-cni-plugins-xkqn6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.096802 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097029 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097128 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.097225 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.097223 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.097437 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098018 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098330 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.098389 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098653 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.098902 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099053 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099498 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099699 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.099810 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.099504 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.100136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.100504 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105395 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105636 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105738 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106061 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.105849 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106221 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106350 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106392 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106577 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106593 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.106620 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107011 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107125 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107078 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107420 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.107845 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108027 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108147 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 
11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108308 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108420 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108543 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108767 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108419 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109258 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108106 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109537 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109351 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109818 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.109957 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110158 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110287 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110470 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.110614 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.108031 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.116991 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.117015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.117036 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.130306 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.143641 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.156422 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.168818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.179491 4837 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.181011 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.189663 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.200539 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205063 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205133 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205171 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205253 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205301 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205407 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205506 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205553 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205578 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205627 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205560 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.205810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206376 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206450 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206564 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206455 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206530 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206729 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206792 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206820 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206897 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.206951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207095 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207122 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207206 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207413 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207594 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207602 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207623 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207710 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207847 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207989 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208014 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208131 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208187 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208263 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208427 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208557 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208591 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208614 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208657 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208702 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208740 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208952 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208973 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209015 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209058 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209111 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209129 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209239 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209314 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209338 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.209460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209497 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209550 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209569 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209833 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209873 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209910 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209931 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211012 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211161 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211237 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211277 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207857 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211313 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.207908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211334 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208018 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208191 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211355 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211393 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211436 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211481 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.211555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211597 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211628 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211671 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211768 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213408 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208219 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208877 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.208814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209202 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209738 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.209973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210334 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210503 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210796 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210817 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.210959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211678 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.211707 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.211817 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.711795341 +0000 UTC m=+91.350062094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214942 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215038 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215061 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215079 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215133 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215144 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215446 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.215880 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.216841 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217217 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.217955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218367 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218577 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218628 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.218745 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219159 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219535 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219582 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219612 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219656 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212114 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212491 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219815 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.219845 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220074 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.220754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221543 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221584 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.221966 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222079 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222134 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222223 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222388 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 
11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222497 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.222977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223059 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223088 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223139 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223165 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223211 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223236 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223445 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223477 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.212992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213024 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213111 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213423 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213625 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213699 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223760 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.213999 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214084 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.214199 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223378 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.223676 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224319 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224495 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.224685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225185 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225253 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225322 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225479 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225595 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.226214 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225685 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225949 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.226282 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225846 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.225957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227662 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227713 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227737 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227884 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227936 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.227978 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228003 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.228416 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229721 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229804 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229463 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229505 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.229867 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230157 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230572 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230706 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.230896 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231001 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231020 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231036 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231056 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231103 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.231353 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231415 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231517 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231534 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231578 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231596 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231612 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231671 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231690 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231703 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232130 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " 
pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232141 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232162 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232254 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232269 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232321 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232387 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232415 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232521 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232558 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232609 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232631 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232662 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232486 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.231158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232669 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232739 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.232759 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233096 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233307 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.733278808 +0000 UTC m=+91.371545751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233558 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.233754 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.733742942 +0000 UTC m=+91.372009925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.233957 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234226 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234253 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234385 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234418 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234439 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234458 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.234801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235627 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235675 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235696 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235712 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235727 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.235883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236709 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236737 4837 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236756 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236772 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236788 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236803 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236816 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236832 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236851 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236867 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236881 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236895 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236909 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on 
node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236924 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236957 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236971 4837 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236985 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237000 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237014 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237027 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237040 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237054 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237068 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237083 4837 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237098 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237111 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237124 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237137 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237151 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237164 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237177 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237190 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237216 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237228 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237242 4837 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237255 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237267 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237280 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237295 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237308 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237322 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237336 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237351 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237365 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237382 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237396 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237409 4837 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237422 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237436 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237449 4837 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237478 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237491 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237505 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237119 4837 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237518 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237575 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237589 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237603 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237618 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237637 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237665 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237678 4837 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.237691 4837 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238405 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238421 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238435 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238450 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238464 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238479 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238493 4837 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238506 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238519 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238532 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238545 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238558 4837 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238574 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238592 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238605 4837 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238618 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238630 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238683 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238695 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238708 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238720 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238733 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238746 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238760 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.238774 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.245603 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.246145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248420 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248670 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.248903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.236040 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.249314 4837 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249486 4837 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249589 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249729 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.249919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250031 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250126 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.250234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250611 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250655 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250670 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.250752 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.750727908 +0000 UTC m=+91.388994671 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251283 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251321 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251341 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.251425 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.751400419 +0000 UTC m=+91.389667182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.252775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.256468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.259156 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.262534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.262765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263355 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263464 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.263866 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.264093 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.264509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265407 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265474 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.265661 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.266177 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.266354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.267756 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.267789 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268188 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.268969 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269438 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269526 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.269721 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.270224 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.273814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.273889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274337 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274171 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274837 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.274977 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279189 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279189 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279241 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279407 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.279792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280472 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280767 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280768 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280856 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.280873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281259 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32f
a41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"lo
g-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281543 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.281617 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.282673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.283969 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.284483 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.285273 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288014 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288247 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.288837 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289429 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.289737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.291919 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.296908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.299490 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.304017 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.306236 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.309733 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.311982 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.320518 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.325602 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.327114 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350505 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.350522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350701 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350717 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350745 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350833 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350898 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350932 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-os-release\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.350939 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-etc-kubernetes\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351000 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-netns\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351030 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c126c88-4541-474c-bc1f-5ca9befa3146-host\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351060 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: 
\"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351155 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351285 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351307 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.351520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351543 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351674 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351697 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351737 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.351855 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351893 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.351926 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:35.851903114 +0000 UTC m=+91.490169897 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352002 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352085 4837 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352108 4837 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352110 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352127 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352146 4837 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352166 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352183 4837 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352199 4837 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-mcd-auth-proxy-config\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352217 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352232 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-conf-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352248 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352272 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352285 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352298 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352310 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352323 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352336 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352349 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352361 4837 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352374 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352387 4837 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352398 4837 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352413 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352426 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352438 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352451 4837 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352466 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352483 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352496 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352510 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352521 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352533 4837 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352523 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352545 4837 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-os-release\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352598 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cnibin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.351903 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-socket-dir-parent\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-bin\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352895 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-rootfs\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352959 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.352981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-system-cni-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-cni-multus\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353171 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-hostroot\") pod \"multus-qg957\" (UID: 
\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-var-lib-kubelet\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353234 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353256 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353276 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353509 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-cni-binary-copy\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-system-cni-dir\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.353992 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-cnibin\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354020 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-multus-certs\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354060 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6398583-f9ff-4b10-829a-503fd523710b-hosts-file\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-host-run-k8s-cni-cncf-io\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354433 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e05c56f7-b007-4165-9e29-98cfa865d020-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354477 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-multus-daemon-config\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354538 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 
11:49:35.354561 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354579 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354594 4837 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354610 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354627 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354668 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354682 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354697 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354714 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354730 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354745 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354761 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354774 4837 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.354778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354789 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354910 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4c126c88-4541-474c-bc1f-5ca9befa3146-serviceca\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.354991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/501b48f2-bba8-44d4-81df-7a8b7df456b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355094 4837 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355115 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355128 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355144 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355161 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355176 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355190 4837 reconciler_common.go:293] "Volume detached for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355203 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355216 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355230 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355243 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355272 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355286 4837 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355299 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355312 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355332 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355347 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355364 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355395 4837 reconciler_common.go:293] 
"Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355415 4837 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355432 4837 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355448 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355465 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355482 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355500 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355518 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355537 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355556 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355572 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355591 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355609 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355626 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355674 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355690 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355706 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355724 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355740 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355758 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355777 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355801 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355822 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355841 4837 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.355860 4837 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.357729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-proxy-tls\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc 
kubenswrapper[4837]: I0313 11:49:35.358481 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.366787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.367135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nj56\" (UniqueName: \"kubernetes.io/projected/86e5afeb-4720-4593-a53e-dfb5381d0b1d-kube-api-access-6nj56\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.368294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/501b48f2-bba8-44d4-81df-7a8b7df456b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.368659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhlq\" (UniqueName: \"kubernetes.io/projected/501b48f2-bba8-44d4-81df-7a8b7df456b5-kube-api-access-pmhlq\") pod \"multus-additional-cni-plugins-xkqn6\" (UID: \"501b48f2-bba8-44d4-81df-7a8b7df456b5\") " pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.369207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"ovnkube-node-4zzrs\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.370295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdh8r\" (UniqueName: \"kubernetes.io/projected/4c126c88-4541-474c-bc1f-5ca9befa3146-kube-api-access-wdh8r\") pod \"node-ca-np68d\" (UID: \"4c126c88-4541-474c-bc1f-5ca9befa3146\") " pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.371697 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9f5g\" (UniqueName: \"kubernetes.io/projected/e05c56f7-b007-4165-9e29-98cfa865d020-kube-api-access-r9f5g\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: \"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.372805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e05c56f7-b007-4165-9e29-98cfa865d020-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt7fl\" (UID: 
\"e05c56f7-b007-4165-9e29-98cfa865d020\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.379178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvtx6\" (UniqueName: \"kubernetes.io/projected/338e0d25-c97d-42ec-a8ec-51ddf77a5ed8-kube-api-access-cvtx6\") pod \"machine-config-daemon-2td4d\" (UID: \"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\") " pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.381204 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ckv\" (UniqueName: \"kubernetes.io/projected/f6398583-f9ff-4b10-829a-503fd523710b-kube-api-access-q7ckv\") pod \"node-resolver-xwmn9\" (UID: \"f6398583-f9ff-4b10-829a-503fd523710b\") " pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.381512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqxj\" (UniqueName: \"kubernetes.io/projected/cbb3f4c6-a6c5-4059-8beb-04179d70aff5-kube-api-access-2fqxj\") pod \"multus-qg957\" (UID: \"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\") " pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427208 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427665 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.427715 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.436236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.447856 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957 WatchSource:0}: Error finding container cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957: Status 404 returned error can't find the container with id cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.456074 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.468510 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-np68d" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.473849 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xwmn9" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.482898 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.484929 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c126c88_4541_474c_bc1f_5ca9befa3146.slice/crio-2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442 WatchSource:0}: Error finding container 2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442: Status 404 returned error can't find the container with id 2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.489936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.498427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qg957" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.505092 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.510916 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6398583_f9ff_4b10_829a_503fd523710b.slice/crio-166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d WatchSource:0}: Error finding container 166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d: Status 404 returned error can't find the container with id 166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.515432 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532514 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.532597 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: W0313 11:49:35.569308 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43df29f7_1351_41f5_bfca_17f804837cb4.slice/crio-17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901 WatchSource:0}: Error finding container 17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901: Status 404 returned error can't find the container with id 17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901 Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635244 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.635254 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.739983 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740048 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.740086 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.759439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.759927 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.759898749 +0000 UTC m=+92.398165512 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760018 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760047 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760229 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760272 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760264371 +0000 UTC m=+92.398531134 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760481 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760529 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760548 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760613 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760591911 +0000 UTC m=+92.398858854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.760715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760720 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760769 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760759756 +0000 UTC m=+92.399026729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760891 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760915 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760927 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.760971 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.760956232 +0000 UTC m=+92.399223185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842938 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.842975 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.862008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.862177 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: E0313 11:49:35.862262 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:36.862239572 +0000 UTC m=+92.500506515 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:35 crc kubenswrapper[4837]: I0313 11:49:35.945983 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:35Z","lastTransitionTime":"2026-03-13T11:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036774 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.036804 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.047069 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051859 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051921 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051936 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.051965 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.061029 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.064970 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065002 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065012 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065028 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.065038 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.074476 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078677 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.078975 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.079095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.079193 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.090817 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096259 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.096289 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.107830 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.107949 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109761 4837 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.109796 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212538 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.212651 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315658 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.315794 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.418959 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.432934 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" exitCode=0 Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.432986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.433036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.435553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"f78b0dfed51389d19f2f72872d4eb4ed23f39b0b8057b3cf1d510ef956001134"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.437863 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xwmn9" event={"ID":"f6398583-f9ff-4b10-829a-503fd523710b","Type":"ContainerStarted","Data":"81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.437929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/node-resolver-xwmn9" event={"ID":"f6398583-f9ff-4b10-829a-503fd523710b","Type":"ContainerStarted","Data":"166ac2e7f6f6b6b3592e07d9264bf9325d076196c25c41eff24468e141f1843d"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.440238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.440267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"81d43c1d485ad8415596ee869abae4167674dbed992582bf1e3cc0ea9b78d6b5"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.442148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-np68d" event={"ID":"4c126c88-4541-474c-bc1f-5ca9befa3146","Type":"ContainerStarted","Data":"e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.442312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-np68d" event={"ID":"4c126c88-4541-474c-bc1f-5ca9befa3146","Type":"ContainerStarted","Data":"2881af3742e0d50b334f23b774fbebbacf8b6806c85f4d40f633913a88a6d442"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445235 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1" exitCode=0 Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.445417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerStarted","Data":"a1b4ca9c1b4c55aa909d80a4fa2f48c689ec4c3090dd6f678eb520f265556c71"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.447415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" event={"ID":"e05c56f7-b007-4165-9e29-98cfa865d020","Type":"ContainerStarted","Data":"d6862f2ba91bcb2caa3e47a7b0c9f6fb516532e510a2cd0268bf640898a72c73"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.448616 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.448701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"47c18be3596bf1461dbdaefb54c85ce132865b95bab24e300e61f29af8e5460c"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.450088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ccefe8faa7e6cf0cf99365286fda2cbf0f3e1517fbef569ff1b331d009363fca"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.452612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.452752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.453281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cb20f8c50a429be8b1d5add1f63d65a117337629d00c45bf4b2b6cd2d0def957"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.461671 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.482748 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.499649 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.514225 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521596 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521660 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.521707 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.527198 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.538184 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.552476 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.572045 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.589237 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.606879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.624137 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.627841 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.643538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.659750 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.672452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.706730 4837 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnku
be-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1
74f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727653 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.727687 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.728439 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.761154 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773202 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773437 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773403051 +0000 UTC m=+94.411669814 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773457 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773516 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773502445 +0000 UTC m=+94.411769208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773520 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773590 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773603 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773609 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773590387 +0000 UTC m=+94.411857150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773614 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773678 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773636849 +0000 UTC m=+94.411903612 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.773694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773759 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773770 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773777 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.773802 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.773796294 +0000 UTC m=+94.412063057 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.774508 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.786437 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-d
aemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.802845 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc 
kubenswrapper[4837]: I0313 11:49:36.817908 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835989 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.836000 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.835991 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.846543 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.862308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.874346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.874573 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: E0313 11:49:36.874725 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:38.874692681 +0000 UTC m=+94.512959604 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.879591 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.908202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.923467 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.935810 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:36Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938247 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:36 crc kubenswrapper[4837]: I0313 11:49:36.938285 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:36Z","lastTransitionTime":"2026-03-13T11:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040598 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.040631 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047370 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047336 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047445 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.047336 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047531 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047692 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047767 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:37 crc kubenswrapper[4837]: E0313 11:49:37.047837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.057064 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.058135 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.058875 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.059555 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.060690 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.061225 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.062216 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.062831 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.063919 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.064427 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.065357 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.066050 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.070335 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.071518 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.072276 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.073691 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.074589 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.076094 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.077444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.078203 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.079403 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.080182 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.080697 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.081883 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.082384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.083479 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.084387 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.084923 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.085501 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086018 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086574 4837 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.086702 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.088084 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.088613 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.089148 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.090277 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.090948 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.091538 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.095106 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.096393 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.097102 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.098011 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.099294 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.100538 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.101459 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.102336 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.104628 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.105862 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.107019 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.107714 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.108303 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.109628 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.110444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.111593 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144109 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.144136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.210770 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247281 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247300 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.247314 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.349985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350250 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.350259 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452807 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.452830 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.457617 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39" exitCode=0 Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.457727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471467 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471544 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471567 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.471580 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.481043 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.494138 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.508312 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.521773 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.535836 4837 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.551052 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554725 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.554763 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.567373 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.581333 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.593944 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.604574 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.617789 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.640389 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z 
is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.656487 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657751 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657791 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657809 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.657822 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.669483 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:37Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760695 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760760 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.760798 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.863848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.864303 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966530 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966564 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966572 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:37 crc kubenswrapper[4837]: I0313 11:49:37.966608 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:37Z","lastTransitionTime":"2026-03-13T11:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069685 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069729 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.069768 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.172806 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280187 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.280235 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383294 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.383323 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.479687 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.479761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.482711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485593 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485628 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.485672 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.486540 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138" exitCode=0 Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.486575 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.499136 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.519417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.536958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.552980 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.567308 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.581415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589500 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.589539 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.594159 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.610038 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:
35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.623140 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\
\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.643672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z 
is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.664269 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.678735 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692485 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.692703 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.694410 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.706188 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.724790 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.745896 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.760744 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.774812 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.791659 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792462 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792436829 +0000 UTC m=+98.430703772 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792503 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.792618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792680 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792786 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792758719 +0000 UTC m=+98.431025672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792797 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792885 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792860642 +0000 UTC m=+98.431127445 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792893 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792921 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792936 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.792984 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.792973596 +0000 UTC m=+98.431240559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.793312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793578 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793627 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793654 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.793733 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.793711029 +0000 UTC m=+98.431977792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795673 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795718 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795738 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.795755 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.821323 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.840762 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.858761 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.872379 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.884451 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.894143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.894315 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: E0313 11:49:38.894385 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:42.894366019 +0000 UTC m=+98.532632782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.897452 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.898687 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:38Z","lastTransitionTime":"2026-03-13T11:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.913739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.931588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:38 crc kubenswrapper[4837]: I0313 11:49:38.943670 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:38Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.000954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001030 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.001042 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047626 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047681 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047662 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.047764 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047801 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047860 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.047922 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:39 crc kubenswrapper[4837]: E0313 11:49:39.048190 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103428 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.103457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.206219 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309167 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309225 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.309273 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.411974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412033 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.412065 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.494117 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923" exitCode=0 Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.494171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.509814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519076 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519095 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.519136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.538836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.551687 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.565705 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.587588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.603314 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.618793 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 
11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.623684 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.635965 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/r
un/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.659515 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z 
is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.681065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.702261 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.713312 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.723698 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726576 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.726616 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.734036 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:39Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.829501 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932455 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932494 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:39 crc kubenswrapper[4837]: I0313 11:49:39.932508 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:39Z","lastTransitionTime":"2026-03-13T11:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035770 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.035992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.036120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.036204 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138838 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.138875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.240723 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.342764 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444629 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444669 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.444681 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.501971 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.504865 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f" exitCode=0 Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.504902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.520934 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.536196 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548292 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548324 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548337 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.548371 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.550795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.566166 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.579528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.593246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.606796 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.620374 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.633879 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651343 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651922 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651951 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.651960 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.668684 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmh
lq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.681051 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.693561 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.703713 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:40Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753853 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.753864 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857265 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857328 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857345 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.857355 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960433 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960443 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:40 crc kubenswrapper[4837]: I0313 11:49:40.960472 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:40Z","lastTransitionTime":"2026-03-13T11:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048239 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048303 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048274 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048433 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.048450 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048541 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048617 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:41 crc kubenswrapper[4837]: E0313 11:49:41.048788 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062873 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062911 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.062928 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166380 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.166482 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269613 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269695 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.269826 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.373712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.374288 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.374633 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.476980 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477056 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.477136 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.517927 4837 generic.go:334] "Generic (PLEG): container finished" podID="501b48f2-bba8-44d4-81df-7a8b7df456b5" containerID="abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11" exitCode=0 Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.518002 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerDied","Data":"abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.537332 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.554986 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.570658 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580279 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580341 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580355 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.580396 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.585987 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.599790 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.616667 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.629224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.640738 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.655814 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.677418 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z 
is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682556 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.682623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.694676 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.707020 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.720046 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.741065 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:41Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785212 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785280 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.785310 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.889270 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992864 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.992979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.993012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:41 crc kubenswrapper[4837]: I0313 11:49:41.993037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:41Z","lastTransitionTime":"2026-03-13T11:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.065310 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.065515 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.066155 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102574 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.102612 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205588 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205625 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205634 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.205671 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.308836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309118 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309127 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309143 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.309154 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.411985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412036 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412064 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.412074 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514890 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514971 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.514992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.515022 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.515045 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.526077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" event={"ID":"501b48f2-bba8-44d4-81df-7a8b7df456b5","Type":"ContainerStarted","Data":"ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.531910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.532524 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.532747 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.543596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db79
8af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.560946 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.575545 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.592564 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.605606 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618018 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618542 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618566 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618587 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.618597 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.641598 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.658953 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.674425 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.684399 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.699623 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.713698 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.720973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721014 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721026 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721047 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.721059 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.725913 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.741327 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.760254 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.781781 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.798863 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.808985 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.821012 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823011 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823089 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823110 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.823124 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.835143 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842649 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.842852 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843024 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843050 4837 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843064 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843113 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843097184 +0000 UTC m=+106.481363947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843433 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843418414 +0000 UTC m=+106.481685177 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843477 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843518 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843500997 +0000 UTC m=+106.481767760 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843574 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843606 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.84359569 +0000 UTC m=+106.481862453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843687 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843711 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843721 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.843752 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.843742655 +0000 UTC m=+106.482009428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.844739 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.858853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.869466 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.880268 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.891202 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.904951 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{
\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.920524 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d
3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925272 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925283 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.925316 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:42Z","lastTransitionTime":"2026-03-13T11:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.932166 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.943418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.943663 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: E0313 11:49:42.943746 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:49:50.943723843 +0000 UTC m=+106.581990616 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.946771 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"
,\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:42 crc kubenswrapper[4837]: I0313 11:49:42.958649 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:42Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027886 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc 
kubenswrapper[4837]: I0313 11:49:43.027904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.027940 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047150 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047275 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047673 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047754 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047817 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.047925 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.047966 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:43 crc kubenswrapper[4837]: E0313 11:49:43.048056 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130623 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.130711 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234170 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234219 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.234289 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336981 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.336991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.337006 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.337016 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440061 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440419 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.440860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.441037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536239 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536383 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.536459 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544906 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544959 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.544997 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.545018 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.569280 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.569375 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.583768 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.596413 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-confi
g-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.610378 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.624495 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\
"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.641666 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d
3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647377 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.647441 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.655949 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.671021 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.681793 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.692701 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.709189 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.721820 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.737473 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.748310 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.749973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750006 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750015 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750031 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.750057 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.757828 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.770556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.782513 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.794789 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.806415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.817091 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.826304 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.842501 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852822 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852836 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.852846 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.854440 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.865479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.880440 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.905733 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d
3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.921165 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.940315 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.954980 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955575 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955657 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955699 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.955718 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:43Z","lastTransitionTime":"2026-03-13T11:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.967265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:43 crc kubenswrapper[4837]: I0313 11:49:43.977096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:43Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.058933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.058991 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059037 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.059056 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162114 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162198 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.162211 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.265832 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368254 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368273 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.368284 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470910 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470976 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.470994 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.471019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.471037 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575866 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.575908 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678668 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678684 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678701 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.678711 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781519 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781541 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781798 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.781817 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884555 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884591 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884599 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.884623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987794 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987839 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987849 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:44 crc kubenswrapper[4837]: I0313 11:49:44.987883 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:44Z","lastTransitionTime":"2026-03-13T11:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047630 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047685 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047769 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.047780 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.047930 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048041 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048136 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:45 crc kubenswrapper[4837]: E0313 11:49:45.048476 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.060616 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.071556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.083474 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090536 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090632 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.090688 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.096162 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.109130 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.122964 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.134078 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.145453 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.162505 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.173479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.187032 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193932 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.193987 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.194005 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.194020 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.200953 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\
"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.219047 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d
3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.236266 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.254362 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297074 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297437 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.297709 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.401724 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.402286 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505544 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505875 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505934 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.505988 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.544138 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/0.log" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.546421 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" exitCode=1 Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.546571 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.547263 4837 scope.go:117] "RemoveContainer" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.563706 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.579668 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.593528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.608690 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609767 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609804 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609832 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.609842 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.621474 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.636455 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.650538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.665462 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.681405 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.707840 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d
3be2fade571b7391f8e1ec9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:45Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0313 11:49:45.067303 6780 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:45.067813 6780 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:49:45.067853 6780 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:49:45.067873 6780 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:49:45.067899 6780 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:49:45.067916 6780 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:49:45.067922 6780 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:49:45.067948 6780 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:49:45.067960 6780 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:49:45.067970 6780 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:49:45.067980 6780 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:45.067989 6780 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:49:45.068001 6780 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:49:45.068051 6780 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712829 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.712890 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.723507 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.741319 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.754400 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.764303 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.774877 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:45Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816397 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.816435 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919757 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919815 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919835 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919877 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:45 crc kubenswrapper[4837]: I0313 11:49:45.919891 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:45Z","lastTransitionTime":"2026-03-13T11:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022493 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.022580 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125626 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125691 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125707 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125726 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.125738 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228401 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.228437 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331692 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331709 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.331724 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387681 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387735 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387772 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.387783 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.401514 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406057 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406092 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.406133 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.418522 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.422779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.422917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.423140 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.436291 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446409 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.446884 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.461756 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466559 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466618 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466648 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.466689 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.480387 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.480619 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482513 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482529 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482553 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.482568 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.551341 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.552120 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/0.log" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555204 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" exitCode=1 Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.555311 4837 scope.go:117] "RemoveContainer" containerID="1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.556240 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:49:46 crc kubenswrapper[4837]: E0313 11:49:46.556477 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.570611 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586004 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586093 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586108 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.586366 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.600820 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.617622 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.637192 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.657305 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.679711 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689202 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689256 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689290 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.689304 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.695147 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.714795 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.731718 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.749317 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.767716 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792089 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792152 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792176 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.792190 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.801442 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321e
ccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1682ba45a5caded567709ca21681b997665e2b7d3be2fade571b7391f8e1ec9b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:45Z\\\",\\\"message\\\":\\\"formers/factory.go:160\\\\nI0313 11:49:45.067303 6780 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:45.067813 6780 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:49:45.067853 6780 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 11:49:45.067873 6780 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:49:45.067899 6780 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:49:45.067916 6780 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:49:45.067922 6780 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:49:45.067948 6780 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:49:45.067960 6780 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:49:45.067970 6780 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 11:49:45.067980 6780 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:49:45.067989 6780 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:49:45.068001 6780 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:49:45.068051 6780 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.823044 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.841268 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:46Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895088 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895164 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.895175 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998125 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998301 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:46 crc kubenswrapper[4837]: I0313 11:49:46.998320 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:46Z","lastTransitionTime":"2026-03-13T11:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047561 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047599 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047684 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.047693 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.047835 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.047948 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.048085 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101458 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101504 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.101527 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205516 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205590 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205609 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.205624 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308499 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308584 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.308743 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411469 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411481 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.411524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513850 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513917 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.513950 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.562964 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.569482 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:49:47 crc kubenswrapper[4837]: E0313 11:49:47.569739 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.582406 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.596220 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.611998 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619137 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619210 4837 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.619245 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.626390 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.635538 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.646490 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.656608 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.667709 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.678153 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.696760 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321e
ccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.711101 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722025 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.722420 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.729848 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 
2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.743969 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.757077 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.769388 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:47Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825933 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825955 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.825964 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928674 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928689 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928713 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:47 crc kubenswrapper[4837]: I0313 11:49:47.928727 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:47Z","lastTransitionTime":"2026-03-13T11:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031745 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031781 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031792 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.031823 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.070872 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135412 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.135448 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249445 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249489 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.249507 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353311 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353391 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.353400 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456297 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456336 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.456351 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559802 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.559878 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.662947 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663054 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.663071 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765622 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.765695 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868316 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.868370 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971040 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971078 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:48 crc kubenswrapper[4837]: I0313 11:49:48.971110 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:48Z","lastTransitionTime":"2026-03-13T11:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048194 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048239 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048270 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.048273 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048373 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048469 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048548 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:49 crc kubenswrapper[4837]: E0313 11:49:49.048596 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073384 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073442 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073460 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.073473 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178068 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178423 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178565 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178756 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.178897 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282071 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.282169 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389931 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.389991 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493075 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493157 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493183 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.493240 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596962 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.596979 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.597004 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.597022 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699524 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699558 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.699574 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802410 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.802540 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906352 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906415 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:49 crc kubenswrapper[4837]: I0313 11:49:49.906442 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:49Z","lastTransitionTime":"2026-03-13T11:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009736 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009816 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009842 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.009860 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112743 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112775 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.112787 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215662 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.215779 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320766 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320834 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.320863 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424240 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424274 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.424283 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526741 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.526859 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630184 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630227 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630239 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630257 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.630270 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734086 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734243 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734293 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.734311 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837764 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837800 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.837818 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.929877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930023 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.929999289 +0000 UTC m=+122.568266062 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930136 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.930198 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930218 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930259 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930249657 +0000 UTC m=+122.568516420 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930351 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930460 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930441783 +0000 UTC m=+122.568708546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930470 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930531 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930549 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930371 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930626 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.930602547 +0000 UTC m=+122.568869320 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930630 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930668 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:50 crc kubenswrapper[4837]: E0313 11:49:50.930723 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:06.93070805 +0000 UTC m=+122.568975003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940393 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940407 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940429 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:50 crc kubenswrapper[4837]: I0313 11:49:50.940446 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:50Z","lastTransitionTime":"2026-03-13T11:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.030775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.030953 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.031026 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:50:07.03100895 +0000 UTC m=+122.669275723 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.042950 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043084 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.043109 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047183 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047230 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047248 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.047287 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047376 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047483 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:51 crc kubenswrapper[4837]: E0313 11:49:51.047694 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146375 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146451 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146502 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.146524 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.249956 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250012 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250031 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.250045 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353339 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353374 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353395 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.353411 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455782 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.455841 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558548 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.558626 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662151 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662278 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.662337 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765267 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765361 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.765435 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873135 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873153 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.873165 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974978 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.974996 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.975013 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:51 crc kubenswrapper[4837]: I0313 11:49:51.975026 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:51Z","lastTransitionTime":"2026-03-13T11:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078478 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.078495 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183811 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183884 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183896 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183954 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.183984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287585 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287661 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287672 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.287700 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390901 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390912 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390930 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.390941 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494249 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494302 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494320 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494344 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.494361 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597349 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597434 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597465 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.597490 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700913 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.700984 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.701001 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.701011 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804785 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804831 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804855 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.804865 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906898 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906944 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906953 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906972 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:52 crc kubenswrapper[4837]: I0313 11:49:52.906984 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:52Z","lastTransitionTime":"2026-03-13T11:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.009990 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010048 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.010094 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.047985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048053 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048053 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048220 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.048297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048471 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048563 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:53 crc kubenswrapper[4837]: E0313 11:49:53.048702 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112317 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.112880 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.113077 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.113264 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.216520 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319617 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319747 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319778 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.319865 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422325 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422387 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422403 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422423 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.422432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.525920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.525993 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526007 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526032 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.526048 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628173 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.628274 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731571 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731584 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731606 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.731619 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.834973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835041 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.835113 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938532 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938595 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938632 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:53 crc kubenswrapper[4837]: I0313 11:49:53.938670 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:53Z","lastTransitionTime":"2026-03-13T11:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041702 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.041799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.048965 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.144973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145024 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145034 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145053 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.145066 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248141 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248181 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248189 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248224 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.248235 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350570 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350579 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350597 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.350609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453562 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453612 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453630 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.453642 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556525 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556586 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556604 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556627 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.556682 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.597366 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.599140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.599591 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.612896 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.623479 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.634835 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.645117 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.655918 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659509 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659520 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.659551 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.674022 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.692311 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209c
c3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.705084 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.717170 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.727376 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.739415 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s
.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.753727 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762540 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762765 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.762881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.763150 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.763238 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.767546 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.778417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.787286 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.800319 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:54Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866573 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866610 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866640 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.866666 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969307 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969385 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969407 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969438 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:54 crc kubenswrapper[4837]: I0313 11:49:54.969457 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:54Z","lastTransitionTime":"2026-03-13T11:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047418 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047506 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.047575 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.047719 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.047889 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.048027 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.048117 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:55 crc kubenswrapper[4837]: E0313 11:49:55.048346 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.066563 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071222 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.071298 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.081265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.091958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.107658 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.119400 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.132256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.143560 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.158970 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178353 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178463 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178481 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.178569 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e13
52070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.195134 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.214160 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.228229 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.244836 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.271471 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321e
ccbe517c9a83736be9b553d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280700 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280728 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280739 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280752 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.280762 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.288096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.305777 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:55Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382920 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc 
kubenswrapper[4837]: I0313 11:49:55.382928 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382943 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.382953 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485390 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.485447 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587611 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587683 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.587753 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690019 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690059 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690070 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.690114 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793221 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.793300 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897711 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897795 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897824 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:55 crc kubenswrapper[4837]: I0313 11:49:55.897875 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:55Z","lastTransitionTime":"2026-03-13T11:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001356 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001446 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.001503 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104161 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104347 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.104362 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206851 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.206977 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.207000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.207015 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309282 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309296 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309322 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.309335 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412306 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.412320 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.515734 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516091 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516170 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.516322 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517482 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517526 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517539 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.517593 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.532710 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537524 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537547 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.537562 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.551507 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.557895 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
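All of the "Error updating node status, will retry" entries in this stretch of the log share one root cause, spelled out at the end of the patch failure: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a TLS certificate whose notAfter (2025-08-24T17:21:41Z) is behind the node clock (2026-03-13T11:49:56Z). A minimal Go sketch for reading that certificate's validity window from the node itself; the endpoint address is taken from the log, everything else is illustrative and not part of the kubelet:

    // certdates.go: print the validity window of whatever certificate the
    // webhook endpoint presents. InsecureSkipVerify is deliberate here: the
    // goal is to read the dates, not to validate the chain.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("dial webhook endpoint: %v", err)
        }
        defer conn.Close()

        now := time.Now()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
                cert.Subject.CommonName,
                cert.NotBefore.Format(time.RFC3339),
                cert.NotAfter.Format(time.RFC3339),
                now.After(cert.NotAfter))
        }
    }

If the printed notAfter matches the 2025-08-24T17:21:41Z seen in the error, the webhook's serving certificate simply expired and every node-status patch will keep failing until it is rotated.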
event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558378 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558477 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.558561 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.575327 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581186 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581216 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.581252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.600067 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604810 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604821 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.604851 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.624448 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:49:56Z is after 2025-08-24T17:21:41Z" Mar 13 11:49:56 crc kubenswrapper[4837]: E0313 11:49:56.624786 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627329 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
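The run of "Error updating node status, will retry" failures above ends with "Unable to update node status" err="update node status exceeds retry count": the kubelet attempts the status patch a fixed, small number of times per sync and then gives up until the next cycle. A simplified sketch of that bounded-retry pattern; the budget of 5 mirrors the nodeStatusUpdateRetry constant in the upstream kubelet sources, but treat the exact value and the stubbed patch call as assumptions, not facts taken from this log:

    // retrysketch.go: bounded retry of a node-status patch, in the shape the
    // log messages suggest. patchNodeStatus is a stand-in that always fails
    // the way the log shows (expired webhook certificate).
    package main

    import (
        "errors"
        "fmt"
    )

    // Assumed retry budget; upstream kubelet uses a similar small constant.
    const nodeStatusUpdateRetry = 5

    func patchNodeStatus() error {
        return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            err := patchNodeStatus()
            if err == nil {
                return nil
            }
            fmt.Printf("Error updating node status, will retry: %v\n", err)
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println("Unable to update node status:", err)
        }
    }

This is why the same multi-megabyte status patch (conditions plus the full image list) reappears several times per second in the log: each retry serializes and sends the whole payload again before the retry count is exhausted.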
event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627368 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627386 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.627398 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730287 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730326 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730338 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730357 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.730370 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833916 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.833936 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936675 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936710 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936722 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936740 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:56 crc kubenswrapper[4837]: I0313 11:49:56.936752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:56Z","lastTransitionTime":"2026-03-13T11:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039457 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039506 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039522 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039543 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.039559 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048036 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048062 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048117 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048207 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048296 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.048339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048522 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:57 crc kubenswrapper[4837]: E0313 11:49:57.048742 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.062716 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142845 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142904 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142926 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.142939 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245881 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245915 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245925 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.245953 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.348975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349027 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.349057 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453196 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453248 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453260 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.453288 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556155 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556211 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556229 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556255 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.556273 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660372 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660461 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660476 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660498 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.660513 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764201 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764218 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.764228 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866045 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866101 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866119 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.866129 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968780 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968903 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:57 crc kubenswrapper[4837]: I0313 11:49:57.968927 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:57Z","lastTransitionTime":"2026-03-13T11:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072480 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072554 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072577 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072607 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.072632 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.175974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176063 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176087 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176129 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.176152 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279190 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279264 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279291 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.279351 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.382975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383052 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383072 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383096 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.383114 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486862 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486931 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486949 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486975 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.486994 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590830 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590876 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590889 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590918 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.590933 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694561 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694578 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694605 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.694623 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.797964 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798426 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798537 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.798609 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901817 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901868 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901883 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901908 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:58 crc kubenswrapper[4837]: I0313 11:49:58.901924 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:58Z","lastTransitionTime":"2026-03-13T11:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005420 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005501 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005519 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005550 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.005574 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047357 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047473 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.047692 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048008 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.047846 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:49:59 crc kubenswrapper[4837]: E0313 11:49:59.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109138 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109841 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109893 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.109954 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213779 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213837 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213878 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.213899 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.317567 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.317914 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318023 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318140 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.318242 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424404 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424453 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424467 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424487 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.424504 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526909 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526935 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.526945 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629723 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629788 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.629808 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733581 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733704 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733753 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.733769 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836601 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836690 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836749 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.836772 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939405 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939466 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939483 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939510 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:49:59 crc kubenswrapper[4837]: I0313 11:49:59.939533 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:49:59Z","lastTransitionTime":"2026-03-13T11:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042569 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042706 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042737 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.042791 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.049304 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146044 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146112 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146159 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.146183 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249156 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249199 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.249239 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351350 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.351364 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453919 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453963 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453987 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.453997 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559083 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559133 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559148 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559167 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.559179 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.617146 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.620302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.620827 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.636787 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.651287 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.661987 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.662008 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.662021 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.666094 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.679901 4837 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.690598 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.704126 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.716472 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.730121 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.741497 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\
\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.753584 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763812 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763843 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763867 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.763877 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.776456 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.797407 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209c
c3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.810761 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.824717 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.838475 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.857417 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866462 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866505 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866535 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.866549 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.875233 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:00Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968174 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968223 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968242 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:00 crc kubenswrapper[4837]: I0313 11:50:00.968252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:00Z","lastTransitionTime":"2026-03-13T11:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048155 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048217 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048307 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.048486 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.048782 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.048876 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.049300 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.049396 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071183 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071232 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071244 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.071271 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174666 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174714 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174730 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.174765 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278169 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278203 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278215 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278231 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.278242 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380162 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380214 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380230 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.380268 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483105 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483192 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483210 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483235 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.483252 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586346 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586389 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586400 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.586432 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.627019 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.627947 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/1.log" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632601 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" exitCode=1 Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632659 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.632714 4837 scope.go:117] "RemoveContainer" containerID="e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.634715 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:01 crc kubenswrapper[4837]: E0313 11:50:01.635349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.660397 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689411 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689490 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.689535 4837 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.691330 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.705340 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.720304 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.734893 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.751163 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.765028 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.781783 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792175 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792348 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792406 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792470 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.792528 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.802435 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.816113 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.838664 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.856558 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.870218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.881532 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895065 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895099 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895107 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895121 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895131 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.895433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.908662 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.930082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c758
7d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e391e6d06012bec4c5b5d6fdde2effc343d6321eccbe517c9a83736be9b553d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:49:46Z\\\",\\\"message\\\":\\\":160\\\\nI0313 11:49:46.362104 6913 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.362220 6913 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 11:49:46.362340 6913 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362469 6913 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:49:46.362889 6913 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:49:46.374778 6913 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:49:46.374817 6913 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:49:46.374885 6913 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:49:46.374926 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:49:46.375025 6913 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping 
watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:01Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998803 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998838 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998846 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998860 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:01 crc kubenswrapper[4837]: I0313 11:50:01.998870 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:01Z","lastTransitionTime":"2026-03-13T11:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.101452 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102079 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102195 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102295 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.102380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205171 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205233 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205327 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.205348 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308259 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308313 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308324 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308343 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.308356 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411362 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411418 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411435 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411459 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.411471 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514098 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514108 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514124 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.514133 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616813 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616827 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616848 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.616862 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.637466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.641014 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:02 crc kubenswrapper[4837]: E0313 11:50:02.641191 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.655695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identi
ty-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.670253 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.684280 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.698595 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.711495 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719370 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719416 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719425 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719439 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.719449 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.723965 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.737702 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:
35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.754859 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\
\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.775008 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c758
7d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.797898 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.814228 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825621 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825680 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825688 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825703 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.825712 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.835433 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.853222 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.872553 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.887741 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.903405 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.917871 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:02Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928790 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928844 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928854 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928870 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:02 crc kubenswrapper[4837]: I0313 11:50:02.928881 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:02Z","lastTransitionTime":"2026-03-13T11:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031408 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031422 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031441 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.031453 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050185 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050798 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050846 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050892 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.050938 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:03 crc kubenswrapper[4837]: E0313 11:50:03.050990 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134614 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134694 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.134767 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237130 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237180 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237191 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237209 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.237220 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340178 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340238 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340251 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340268 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.340280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443424 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443432 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443449 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.443458 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545471 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545517 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545528 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545545 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.545556 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646937 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646973 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.646985 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.647000 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.647011 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749732 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749799 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749818 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.749831 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852271 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852318 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852332 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.852341 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954402 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954444 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954456 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954473 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:03 crc kubenswrapper[4837]: I0313 11:50:03.954483 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:03Z","lastTransitionTime":"2026-03-13T11:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057515 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057603 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057616 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057633 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.057663 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159721 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159733 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159759 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.159771 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263671 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263771 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.263789 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367613 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367697 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367708 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.367740 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470719 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470773 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470789 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.470799 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.573924 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.573992 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574003 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574021 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.574035 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677630 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677744 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677768 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.677781 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781234 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781310 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781359 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781383 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.781398 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885277 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885342 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885351 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885369 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:04 crc kubenswrapper[4837]: I0313 11:50:04.885380 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:04Z","lastTransitionTime":"2026-03-13T11:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:04 crc kubenswrapper[4837]: E0313 11:50:04.986286 4837 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.047946 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048135 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048257 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048393 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048528 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.048596 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.048677 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.064528 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.078982 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.093233 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.110662 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.124958 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.137516 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.152542 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: E0313 11:50:05.158115 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.169749 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.185675 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.200025 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.215815 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.233749 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.251224 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s
.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.272057 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c758
7d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.291829 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.307161 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:05 crc kubenswrapper[4837]: I0313 11:50:05.320608 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:05Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.383779 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.401246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.421089 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.439531 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.454679 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.469722 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.482059 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.496358 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.511956 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.525730 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.537576 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.551314 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{
\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.578217 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c758
7d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.596083 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.607082 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.618690 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.632345 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.648745 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688077 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688122 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc 
kubenswrapper[4837]: I0313 11:50:06.688131 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.688155 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.702211 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706226 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706261 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706286 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.706295 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.718871 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723746 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723754 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723769 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.723778 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.738106 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744090 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744123 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744132 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744146 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.744156 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.763700 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768206 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768245 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768253 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768270 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:06 crc kubenswrapper[4837]: I0313 11:50:06.768280 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:06Z","lastTransitionTime":"2026-03-13T11:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.780021 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:06Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:06 crc kubenswrapper[4837]: E0313 11:50:06.780135 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008200 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008367 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008409 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.008425 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008467 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008432331 +0000 UTC m=+154.646699134 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008538 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008551 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008598 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008583704 +0000 UTC m=+154.646850467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008613 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008607475 +0000 UTC m=+154.646874238 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008615 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008666 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008684 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008619 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008759 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008773 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008723 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008708648 +0000 UTC m=+154.646975631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.008851 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.008833302 +0000 UTC m=+154.647100065 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047835 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047835 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.047978 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.048000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.047983 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048224 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.048331 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:07 crc kubenswrapper[4837]: I0313 11:50:07.109141 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.109341 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:07 crc kubenswrapper[4837]: E0313 11:50:07.109417 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:50:39.109399029 +0000 UTC m=+154.747665782 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.047963 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048374 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048099 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048123 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048552 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048601 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:09 crc kubenswrapper[4837]: I0313 11:50:09.048125 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:09 crc kubenswrapper[4837]: E0313 11:50:09.048709 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:10 crc kubenswrapper[4837]: E0313 11:50:10.159280 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048066 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048226 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048219 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048282 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:11 crc kubenswrapper[4837]: I0313 11:50:11.048372 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048467 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:11 crc kubenswrapper[4837]: E0313 11:50:11.048541 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047515 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047573 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047676 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:13 crc kubenswrapper[4837]: I0313 11:50:13.047783 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.047783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.047987 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.048061 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:13 crc kubenswrapper[4837]: E0313 11:50:13.048323 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:14 crc kubenswrapper[4837]: I0313 11:50:14.048458 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:14 crc kubenswrapper[4837]: E0313 11:50:14.048767 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047580 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.047724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047546 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.047766 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.047895 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.048064 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.048132 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.067880 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.088697 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.104256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.117132 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.131042 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s
.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.149600 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c758
7d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: E0313 11:50:15.159724 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.169847 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\"
:\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.184603 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.199708 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.211934 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.224353 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.235972 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.249581 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.263256 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.274983 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.288666 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:15 crc kubenswrapper[4837]: I0313 11:50:15.299555 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:15Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838608 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838712 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838731 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838758 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.838775 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.860391 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865777 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865840 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865863 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865891 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.865913 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.886839 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891356 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891421 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891431 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891448 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.891459 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.904376 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908414 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908468 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908479 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908495 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.908505 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.924312 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928682 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928715 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928727 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928742 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:16 crc kubenswrapper[4837]: I0313 11:50:16.928752 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:16Z","lastTransitionTime":"2026-03-13T11:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.940634 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:16Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:16 crc kubenswrapper[4837]: E0313 11:50:16.940762 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048061 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048145 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:17 crc kubenswrapper[4837]: I0313 11:50:17.048069 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048190 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048442 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:17 crc kubenswrapper[4837]: E0313 11:50:17.048607 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047940 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.047997 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:19 crc kubenswrapper[4837]: I0313 11:50:19.048026 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.048874 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.048977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.049073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:19 crc kubenswrapper[4837]: E0313 11:50:19.049144 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:20 crc kubenswrapper[4837]: E0313 11:50:20.161430 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.047964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.048024 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.048084 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:21 crc kubenswrapper[4837]: I0313 11:50:21.047962 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048142 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:21 crc kubenswrapper[4837]: E0313 11:50:21.048752 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048190 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048218 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048270 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.048452 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048485 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:23 crc kubenswrapper[4837]: E0313 11:50:23.048800 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715361 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715415 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" exitCode=1 Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27"} Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.715930 4837 scope.go:117] "RemoveContainer" containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.734115 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.747734 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.760281 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.774288 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.791096 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.805880 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.819751 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.835283 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.854548 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.874933 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.887450 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.902856 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.915976 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.931714 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.943695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.956818 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:23 crc kubenswrapper[4837]: I0313 11:50:23.968617 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:23Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.721151 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.721230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e"} Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.742284 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.755556 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.771677 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.784915 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.799672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.816488 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.835527 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.852203 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.866717 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.885981 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network 
is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.911498 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.937101 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.955817 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:24 crc kubenswrapper[4837]: I0313 11:50:24.981853 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.001291 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:24Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.014899 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.028897 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048214 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048304 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048348 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.048489 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048613 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.048863 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.049027 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.065482 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708
385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.087846 4837 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c
857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.104773 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.118625 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.131246 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.146480 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.161822 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: E0313 11:50:25.161960 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.178286 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.193279 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.206565 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.226945 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b
0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.247180 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209c
c3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.259376 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.272930 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.283875 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.296114 4837 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:25 crc kubenswrapper[4837]: I0313 11:50:25.309838 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:25Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047863 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047922 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.047936 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048037 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.048172 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048266 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048362 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.048728 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.068436 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069330 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069381 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069399 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069422 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.069440 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.094335 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100698 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100750 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100763 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100784 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.100797 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.118091 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122787 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122852 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122865 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122882 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.122894 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.167326 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.175823 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176120 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176360 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.176448 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.195931 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199826 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199861 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199871 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199885 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:27 crc kubenswrapper[4837]: I0313 11:50:27.199894 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:27Z","lastTransitionTime":"2026-03-13T11:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.216432 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:27Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:27 crc kubenswrapper[4837]: E0313 11:50:27.216590 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048348 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048425 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048519 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.048566 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.049559 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049812 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049904 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.049971 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:29 crc kubenswrapper[4837]: E0313 11:50:29.050104 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.738455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.741148 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.741582 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.754984 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.767447 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-m
anager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.782443 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.805208 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.827868 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209c
c3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.840323 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.856084 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.873782 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.890772 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.906218 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.921987 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.936434 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.953010 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.972536 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:29 crc kubenswrapper[4837]: I0313 11:50:29.986463 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.001342 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:29Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.016695 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.032767 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: E0313 11:50:30.163598 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.747145 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.748111 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/2.log" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752533 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" exitCode=1 Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.752793 4837 scope.go:117] "RemoveContainer" containerID="7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.753581 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:30 crc kubenswrapper[4837]: E0313 11:50:30.753837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.771791 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.790150 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.811135 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.826417 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.843265 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.856146 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.873034 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.889500 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.904860 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.926024 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.940252 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.966918 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b
0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f1cbcbcc13da4f4e1d2b4678deafdb330e2c7587d8bd8d528597f279c254ff7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:00Z\\\",\\\"message\\\":\\\":00.918999 7126 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 11:50:00.919044 7126 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 11:50:00.919014 7126 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:00.919071 7126 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 11:50:00.919092 7126 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 11:50:00.919110 7126 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 11:50:00.919107 7126 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:00.919127 7126 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 11:50:00.919146 7126 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 11:50:00.919153 7126 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:00.919178 7126 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 11:50:00.919162 7126 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 11:50:00.919211 7126 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:00.919218 7126 factory.go:656] Stopping watch factory\\\\nI0313 11:50:00.919227 7126 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 11:50:00.919233 7126 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:30 crc kubenswrapper[4837]: I0313 11:50:30.991708 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:30Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.009588 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.027077 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.044822 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047874 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047941 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.047981 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048071 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048170 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048285 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.048385 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.062578 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.078156 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.759090 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.768393 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:31 crc kubenswrapper[4837]: E0313 11:50:31.769433 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.783827 4837 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.800672 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.813110 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.828391 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.842952 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.858564 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.874584 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.890935 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.907377 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.921178 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 
11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.936029 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.959335 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.981027 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:31 crc kubenswrapper[4837]: I0313 11:50:31.998034 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:31Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.015364 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.029731 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.046472 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:32 crc kubenswrapper[4837]: I0313 11:50:32.066496 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:32Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047511 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047590 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047665 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047700 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047724 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.047837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:33 crc kubenswrapper[4837]: I0313 11:50:33.047938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:33 crc kubenswrapper[4837]: E0313 11:50:33.048001 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047592 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047596 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048021 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047706 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.047629 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048262 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.048306 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.061276 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffb02ea00858228c6a446245d9b555b1c78c7c6d72816c5c216dd688304944f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.071850 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xwmn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6398583-f9ff-4b10-829a-503fd523710b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81bbbbaa679f139bb4f89ffd88a4719076e3b05998470e44663f39d77c554b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7ckv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xwmn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.081799 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86e5afeb-4720-4593-a53e-dfb5381d0b1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nj56\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cjn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.093047 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.104935 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-np68d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c126c88-4541-474c-bc1f-5ca9befa3146\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e222a4e8317b8a22b443189e2e1139a8f7ffbe54f43e01fa2c67bf193869fabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-wdh8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-np68d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.116815 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4513a0ad-4bd6-4aec-bce8-cb6337db1d57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://603ace7b4b1c79d13e8d3fd10baf836c890a60bfbdae807921ae0cc6365bc3dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f1e59b3f4d6931337d42b5716a5ab247f9314e2a0eb400f8fc438c0e1ff95bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.132492 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4550efaabe4b3c701aad154363fa9456bac3525f1450b76a152156599d3fb80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.143844 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.159952 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12789cc5674ec4d2ea4993f7b24fbf643f0ba9fc40d65b3f1da4d0b905f96ec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58194199844fb42fc9e91a2e38306d6c6bc55c77daedd8c88446f45307886a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: E0313 11:50:35.164299 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.173544 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e05c56f7-b007-4165-9e29-98cfa865d020\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35e757ae9d58c31e3308d64e190299249471b2542f27ce093fe589cd2331043b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://010a055a1fd60be376cbc6b201a282a004c2c10b6f8b696ce028bbbe160e6139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r9f5g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt7fl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.187163 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qg957" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbb3f4c6-a6c5-4059-8beb-04179d70aff5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:22Z\\\",\\\"message\\\":\\\"2026-03-13T11:49:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124\\\\n2026-03-13T11:49:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_94226f47-389c-46ce-a284-334a08311124 to /host/opt/cni/bin/\\\\n2026-03-13T11:49:37Z [verbose] multus-daemon started\\\\n2026-03-13T11:49:37Z [verbose] Readiness Indicator file check\\\\n2026-03-13T11:50:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:50:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fqxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qg957\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.206884 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43df29f7-1351-41f5-bfca-17f804837cb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T11:50:29Z\\\",\\\"message\\\":\\\"ormers/factory.go:160\\\\nI0313 11:50:29.836038 7458 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.836680 7458 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 11:50:29.837264 7458 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 11:50:29.845714 7458 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 11:50:29.845841 7458 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 11:50:29.845894 7458 factory.go:656] Stopping watch factory\\\\nI0313 11:50:29.845925 7458 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 11:50:29.845982 7458 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 11:50:29.852851 7458 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0313 11:50:29.852888 7458 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0313 11:50:29.852950 7458 ovnkube.go:599] Stopped ovnkube\\\\nI0313 11:50:29.852977 7458 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 11:50:29.853057 7458 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:50:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85hll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4zzrs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.231596 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b481010-5fbc-4c5c-b782-9dbb7524023e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286e1cf3e088b3ccc0949721368fe176894a5d6bdf8d1dd108b92adecf45952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c00ffa41f4f30f0516fe955d
957ac92818f9576557f7e1352070e221ac7b09d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae595b4ed8facfb5d9a747dac75233102bd05bc21e4bd5c644c0a1985bb7ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a7546e653505747aa787947982ccf181e3209cc3110f8bde34360ea73a1c69d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18f3bbb38d2bec20e9b96f72dee3906973b4cc3e658d067928a46a8de37652f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eadc4581a9d3bb83f15ec97767cde398404e122c42fbf63c555637e8eb2bf0f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://394c9285cc6e5bfebadf8c66038f23ba9866f76819d209e92ca846293d1e634f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d533ce5a548196fcfb20da38773e2f8c00e91ca696111b5bf0096cb7a81cb51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.247151 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddb785bc-eb5f-41db-9d64-f1cecd2d25f9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f22c5fe3a62270693c25f87ecfb55bdd775a49445bc2d88cb26ec6c6daf2291\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a35cb83c3dfbdb94194292c22b9c7a42478f1dff83f6f703c45da3c08613a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20b14790e78b11453c1d1b4a35d40c25fa01684c6b20f05cac9002eda7645cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f0e16118f5b414af37ef05c357d964583bfd8467d1f7434ce8e778334909a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.263397 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.276894 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fa4d35f62d4053e21c7ccf3f15408f841789aca98290270b07bedc130614631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvtx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2td4d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.289956 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93dcd114-c39a-4b27-aa9c-a42e3ef7cd79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:48:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T11:49:11Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 11:49:10.789921 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 11:49:10.790862 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 11:49:10.792348 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361817431/tls.crt::/tmp/serving-cert-1361817431/tls.key\\\\\\\"\\\\nI0313 11:49:11.060533 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 11:49:11.064576 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 11:49:11.064598 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 11:49:11.064618 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 11:49:11.064623 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 11:49:11.074003 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 11:49:11.074062 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074073 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 11:49:11.074087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 11:49:11.074096 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 11:49:11.074104 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 11:49:11.074113 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0313 11:49:11.074181 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0313 11:49:11.075668 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:48:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:48:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:48:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:48:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:35 crc kubenswrapper[4837]: I0313 11:50:35.304901 4837 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"501b48f2-bba8-44d4-81df-7a8b7df456b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T11:49:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f102e98673ab18c97a49b7663d696cfc34b8a477b625c17720f895014e128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T11:49:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://595633ac46035fbd9ecd0e0932be459bac052770959be5741ceacdc4750a9db1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dec43be6d303551988343bb2d1bac82273468313780a6e0c903e23ff0d859c39\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578f800112657e6a1333c4d14332543e1726ababa9e0f7615335254246ed4138\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f1a3305a593eb65a278fcc32089efa5f82cc4a165c4713a2fe77ab0660a8923\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d9079599c652c719a11577aac07f7f22f0156001bcfefd827b7882099c0831f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abbbc1f6dbd55fe289f7737e892adb6c5a2df05c66a04984ae25769cfe49ad11\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T11:49:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T11:49:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmhlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T11:49:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xkqn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:35Z is after 2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047431 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047660 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047753 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047480 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.047860 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.047955 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.048021 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509889 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509932 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509941 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509957 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.509968 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.521533 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.524974 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525009 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525018 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525035 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.525044 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.544396 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548241 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548289 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548298 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548312 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.548321 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.561439 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565220 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565252 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565262 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565276 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.565285 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.577365 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581038 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581060 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581069 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581081 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:37 crc kubenswrapper[4837]: I0313 11:50:37.581090 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:37Z","lastTransitionTime":"2026-03-13T11:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.593264 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T11:50:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"205607ff-4e76-4a9e-84cc-5670826221a2\\\",\\\"systemUUID\\\":\\\"91a43e7e-d083-4b9e-bcd8-790411e8b2f1\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T11:50:37Z is after 
2025-08-24T17:21:41Z" Mar 13 11:50:37 crc kubenswrapper[4837]: E0313 11:50:37.593599 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038197 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038277 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038328 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038303687 +0000 UTC m=+218.676570470 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038413 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038430 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.038428 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038440 4837 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038472 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038509 4837 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038527 4837 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038466 4837 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038545 4837 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc 
kubenswrapper[4837]: E0313 11:50:39.038531 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038516363 +0000 UTC m=+218.676783136 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038658 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038620646 +0000 UTC m=+218.676887429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038690 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038676448 +0000 UTC m=+218.676943221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.038715 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.038704779 +0000 UTC m=+218.676971552 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047317 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047347 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047368 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.047390 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047460 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047574 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047711 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.047773 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:39 crc kubenswrapper[4837]: I0313 11:50:39.138914 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.139092 4837 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:39 crc kubenswrapper[4837]: E0313 11:50:39.139214 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs podName:86e5afeb-4720-4593-a53e-dfb5381d0b1d nodeName:}" failed. No retries permitted until 2026-03-13 11:51:43.139193559 +0000 UTC m=+218.777460532 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs") pod "network-metrics-daemon-cjn4q" (UID: "86e5afeb-4720-4593-a53e-dfb5381d0b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 11:50:40 crc kubenswrapper[4837]: E0313 11:50:40.165864 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.047817 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.047974 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.047976 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.048039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:41 crc kubenswrapper[4837]: I0313 11:50:41.048058 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048198 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048273 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:41 crc kubenswrapper[4837]: E0313 11:50:41.048333 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047726 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047759 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047737 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.047863 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.047864 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.047940 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.048053 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.048254 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:43 crc kubenswrapper[4837]: I0313 11:50:43.048933 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:43 crc kubenswrapper[4837]: E0313 11:50:43.049075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.048012 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.048979 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.049039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.049119 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049348 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049468 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.049511 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.048968 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.061841 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.089044 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.089021143 podStartE2EDuration="18.089021143s" podCreationTimestamp="2026-03-13 11:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.072384479 +0000 UTC m=+160.710651242" watchObservedRunningTime="2026-03-13 11:50:45.089021143 +0000 UTC m=+160.727287936" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.149929 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-np68d" podStartSLOduration=104.149887413 podStartE2EDuration="1m44.149887413s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.149851163 +0000 UTC m=+160.788117926" watchObservedRunningTime="2026-03-13 11:50:45.149887413 +0000 UTC m=+160.788154196" Mar 13 11:50:45 crc kubenswrapper[4837]: E0313 11:50:45.166693 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.198931 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=57.19890946 podStartE2EDuration="57.19890946s" podCreationTimestamp="2026-03-13 11:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.181293144 +0000 UTC m=+160.819559907" watchObservedRunningTime="2026-03-13 11:50:45.19890946 +0000 UTC m=+160.837176223" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.211288 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.21126448 podStartE2EDuration="48.21126448s" podCreationTimestamp="2026-03-13 11:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.199186678 +0000 UTC m=+160.837453441" watchObservedRunningTime="2026-03-13 11:50:45.21126448 +0000 UTC m=+160.849531243" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.225658 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podStartSLOduration=103.225603501 podStartE2EDuration="1m43.225603501s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.225531039 +0000 UTC m=+160.863797822" watchObservedRunningTime="2026-03-13 11:50:45.225603501 +0000 UTC m=+160.863870284" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.250787 
4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt7fl" podStartSLOduration=103.250768845 podStartE2EDuration="1m43.250768845s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.23824136 +0000 UTC m=+160.876508133" watchObservedRunningTime="2026-03-13 11:50:45.250768845 +0000 UTC m=+160.889035608" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.251062 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qg957" podStartSLOduration=103.251058374 podStartE2EDuration="1m43.251058374s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.249994901 +0000 UTC m=+160.888261674" watchObservedRunningTime="2026-03-13 11:50:45.251058374 +0000 UTC m=+160.889325137" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.309540 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xkqn6" podStartSLOduration=103.309521028 podStartE2EDuration="1m43.309521028s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.309267031 +0000 UTC m=+160.947533804" watchObservedRunningTime="2026-03-13 11:50:45.309521028 +0000 UTC m=+160.947787801" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.309774 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=63.309767576 podStartE2EDuration="1m3.309767576s" podCreationTimestamp="2026-03-13 11:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.286522943 +0000 UTC m=+160.924789706" watchObservedRunningTime="2026-03-13 11:50:45.309767576 +0000 UTC m=+160.948034339" Mar 13 11:50:45 crc kubenswrapper[4837]: I0313 11:50:45.338371 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xwmn9" podStartSLOduration=104.338350478 podStartE2EDuration="1m44.338350478s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:45.329461737 +0000 UTC m=+160.967728500" watchObservedRunningTime="2026-03-13 11:50:45.338350478 +0000 UTC m=+160.976617241" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047884 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048090 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.047914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048206 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048265 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:47 crc kubenswrapper[4837]: E0313 11:50:47.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723103 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723173 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723200 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723228 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.723250 4837 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T11:50:47Z","lastTransitionTime":"2026-03-13T11:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.797659 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm"] Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.798092 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.799918 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.799973 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.800002 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.800578 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927433 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927474 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927497 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:47 crc kubenswrapper[4837]: I0313 11:50:47.927680 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029286 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029431 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029456 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.029564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86283508-cb82-4fca-b672-4c5cd27b8018-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.030763 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86283508-cb82-4fca-b672-4c5cd27b8018-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.036817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86283508-cb82-4fca-b672-4c5cd27b8018-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.049493 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86283508-cb82-4fca-b672-4c5cd27b8018-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6dfqm\" (UID: \"86283508-cb82-4fca-b672-4c5cd27b8018\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.111765 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.139558 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.149148 4837 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.813562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" event={"ID":"86283508-cb82-4fca-b672-4c5cd27b8018","Type":"ContainerStarted","Data":"a436092eb4cf4025a437251be7e13e7be3ca141be88a723720a9c9b6a732bf29"} Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.813608 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" event={"ID":"86283508-cb82-4fca-b672-4c5cd27b8018","Type":"ContainerStarted","Data":"0c8bbf900ab9ff859b13a83e5127257a9dd11f980c2466929bccfbeac6ae5eef"} Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.828620 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.828587033 podStartE2EDuration="3.828587033s" podCreationTimestamp="2026-03-13 11:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:47.828722777 +0000 UTC m=+163.466989540" watchObservedRunningTime="2026-03-13 11:50:48.828587033 +0000 UTC m=+164.466853836" Mar 13 11:50:48 crc kubenswrapper[4837]: I0313 11:50:48.829446 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6dfqm" podStartSLOduration=107.82943348 podStartE2EDuration="1m47.82943348s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:50:48.827878012 +0000 UTC m=+164.466144805" watchObservedRunningTime="2026-03-13 11:50:48.82943348 +0000 UTC m=+164.467700293" Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047677 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047677 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.047859 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:49 crc kubenswrapper[4837]: I0313 11:50:49.047697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.047965 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.048133 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:49 crc kubenswrapper[4837]: E0313 11:50:49.048246 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:50 crc kubenswrapper[4837]: E0313 11:50:50.167701 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048367 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:51 crc kubenswrapper[4837]: I0313 11:50:51.048169 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048465 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048547 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:51 crc kubenswrapper[4837]: E0313 11:50:51.048618 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047832 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047865 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.047878 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.047989 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:53 crc kubenswrapper[4837]: I0313 11:50:53.048050 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048196 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048314 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:53 crc kubenswrapper[4837]: E0313 11:50:53.048384 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:54 crc kubenswrapper[4837]: I0313 11:50:54.048849 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:50:54 crc kubenswrapper[4837]: E0313 11:50:54.049128 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047544 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047606 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.047701 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:55 crc kubenswrapper[4837]: I0313 11:50:55.048619 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048613 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048764 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.048848 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.049135 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:55 crc kubenswrapper[4837]: E0313 11:50:55.168652 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048242 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:57 crc kubenswrapper[4837]: I0313 11:50:57.048116 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048293 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048467 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.048707 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:57 crc kubenswrapper[4837]: E0313 11:50:57.049292 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047236 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047293 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047420 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047486 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:50:59 crc kubenswrapper[4837]: I0313 11:50:59.047531 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047618 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047789 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:50:59 crc kubenswrapper[4837]: E0313 11:50:59.047841 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:00 crc kubenswrapper[4837]: E0313 11:51:00.169806 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047611 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047772 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.047869 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.047780 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048008 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048244 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:01 crc kubenswrapper[4837]: I0313 11:51:01.048592 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:01 crc kubenswrapper[4837]: E0313 11:51:01.048745 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048082 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048144 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048180 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:03 crc kubenswrapper[4837]: I0313 11:51:03.048117 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048236 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048325 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048413 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:03 crc kubenswrapper[4837]: E0313 11:51:03.048536 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048192 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048232 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:05 crc kubenswrapper[4837]: I0313 11:51:05.048416 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049674 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049760 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.049906 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.050024 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:05 crc kubenswrapper[4837]: E0313 11:51:05.170323 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047747 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047815 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.047876 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.047977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:07 crc kubenswrapper[4837]: I0313 11:51:07.048019 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048214 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048528 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:07 crc kubenswrapper[4837]: E0313 11:51:07.048692 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:08 crc kubenswrapper[4837]: I0313 11:51:08.048486 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:51:08 crc kubenswrapper[4837]: E0313 11:51:08.048688 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4zzrs_openshift-ovn-kubernetes(43df29f7-1351-41f5-bfca-17f804837cb4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047904 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047938 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.047981 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.048078 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048081 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048199 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048301 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.048413 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.888841 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889212 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/0.log" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889248 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" exitCode=1 Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e"} Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889311 4837 scope.go:117] "RemoveContainer" containerID="9de398c1433d502cfa6bcb1da8cac72bfced99028ef5172f1e038bb7cbf38a27" Mar 13 11:51:09 crc kubenswrapper[4837]: I0313 11:51:09.889693 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:51:09 crc kubenswrapper[4837]: E0313 11:51:09.889851 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 11:51:10 crc kubenswrapper[4837]: E0313 11:51:10.171514 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:10 crc kubenswrapper[4837]: I0313 11:51:10.892946 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048199 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048200 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048337 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048386 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:11 crc kubenswrapper[4837]: I0313 11:51:11.048429 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048617 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048789 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:11 crc kubenswrapper[4837]: E0313 11:51:11.048891 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047141 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047171 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047193 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047262 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:13 crc kubenswrapper[4837]: I0313 11:51:13.047340 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047402 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047493 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:13 crc kubenswrapper[4837]: E0313 11:51:13.047560 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.047391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049134 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049277 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049385 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:15 crc kubenswrapper[4837]: I0313 11:51:15.049434 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.049805 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.050299 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:15 crc kubenswrapper[4837]: E0313 11:51:15.172482 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047715 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047682 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:17 crc kubenswrapper[4837]: I0313 11:51:17.047710 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047815 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047923 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.047970 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:17 crc kubenswrapper[4837]: E0313 11:51:17.048129 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.047944 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.048010 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.047964 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.048390 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048502 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048851 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.048962 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.049015 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.049032 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.920742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.923404 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerStarted","Data":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.923796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.954914 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cjn4q"] Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.955042 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:19 crc kubenswrapper[4837]: E0313 11:51:19.955132 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:19 crc kubenswrapper[4837]: I0313 11:51:19.965114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podStartSLOduration=137.965097201 podStartE2EDuration="2m17.965097201s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:19.964258284 +0000 UTC m=+195.602525047" watchObservedRunningTime="2026-03-13 11:51:19.965097201 +0000 UTC m=+195.603363964" Mar 13 11:51:20 crc kubenswrapper[4837]: E0313 11:51:20.173817 4837 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047151 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047549 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047364 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047698 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047166 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.047582 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:51:21 crc kubenswrapper[4837]: E0313 11:51:21.047770 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.932192 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:51:21 crc kubenswrapper[4837]: I0313 11:51:21.932859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c"} Mar 13 11:51:22 crc kubenswrapper[4837]: I0313 11:51:22.047158 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:22 crc kubenswrapper[4837]: E0313 11:51:22.047288 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047867 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047903 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:23 crc kubenswrapper[4837]: I0313 11:51:23.047960 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.047992 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.048123 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:23 crc kubenswrapper[4837]: E0313 11:51:23.048179 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:24 crc kubenswrapper[4837]: I0313 11:51:24.047880 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:24 crc kubenswrapper[4837]: E0313 11:51:24.048119 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cjn4q" podUID="86e5afeb-4720-4593-a53e-dfb5381d0b1d" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.048096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049529 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.049577 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:25 crc kubenswrapper[4837]: I0313 11:51:25.049587 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049752 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 11:51:25 crc kubenswrapper[4837]: E0313 11:51:25.049875 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.047575 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.050122 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 11:51:26 crc kubenswrapper[4837]: I0313 11:51:26.051893 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048176 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048238 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.048176 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.050511 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.051931 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.051995 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 11:51:27 crc kubenswrapper[4837]: I0313 11:51:27.052110 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.882208 4837 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.935004 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.936257 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940523 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940559 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940547 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940730 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.940876 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.941135 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.949712 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.954431 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.954994 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.955428 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.955428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.956947 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.958157 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.958933 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.961867 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.962523 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.963541 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.964188 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.964874 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.965522 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.970807 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.971743 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.976436 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.989859 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.990261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.990671 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991158 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991406 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991652 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.991660 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992009 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992276 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992336 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992527 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.992617 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"] Mar 13 11:51:28 crc kubenswrapper[4837]: I0313 11:51:28.993957 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.001972 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.002304 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.002438 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.004288 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.005735 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006024 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006433 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.006656 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.007236 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.007400 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008167 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008466 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008688 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008727 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008998 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008773 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009346 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009445 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009453 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009615 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009889 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009743 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010243 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008779 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010710 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.008892 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010996 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.010547 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009263 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011546 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011562 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.009258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.011683 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.012174 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.015735 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.015976 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.016567 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.016886 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017034 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017259 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.017627 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018016 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018183 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018659 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018844 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018867 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018969 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.018996 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.019630 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.019937 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.020573 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021038 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.021659 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.022112 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.023300 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.026223 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.026808 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.027353 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.030001 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.030613 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031147 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031692 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.031730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032049 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032111 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032610 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.032926 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.033326 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.035925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.036734 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.037002 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9tkxg"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.038075 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.039629 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.041780 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.042412 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.042957 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.043221 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.066372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.071262 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.075220 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.096344 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.096617 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.103993 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.104858 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.107238 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.107981 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108342 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108510 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.108977 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109067 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109098 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109125 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109270 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109358 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod 
\"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109402 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109508 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109673 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109675 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109742 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109892 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109915 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.109922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110159 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110255 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110304 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110367 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110444 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc 
kubenswrapper[4837]: I0313 11:51:29.110631 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110672 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110909 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110934 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.110980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111000 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111021 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111065 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnwqw\" (UniqueName: \"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") 
pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111191 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111256 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdq6z\" (UniqueName: \"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111323 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111344 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" 
(UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111387 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.111827 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.128310 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.129905 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130312 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130516 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130750 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130797 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.130967 4837 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131058 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131155 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131314 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131426 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131525 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131588 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131626 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131751 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131790 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131060 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131922 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131941 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132104 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132218 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132463 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132577 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.131752 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132764 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.132783 4837 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-oauth-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.134400 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.137101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.140101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.147022 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.149565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154231 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154465 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154687 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.154882 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.156521 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.163760 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164083 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164343 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164432 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.164818 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165133 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165296 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.165503 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.188608 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189239 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189379 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.189511 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.190577 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.190728 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.191397 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.192062 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.193654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.193895 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.194359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.197205 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.198359 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.199357 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.199462 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.200515 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.201710 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.203038 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.204332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.205728 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.207024 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.208392 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.209738 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.211048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212132 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212159 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwg6b\" (UniqueName: \"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: 
\"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212245 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212270 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212343 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmhx9\" (UniqueName: \"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212452 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212485 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212565 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212598 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212616 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnwqw\" (UniqueName: \"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212723 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212741 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212760 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212824 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212855 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdq6z\" (UniqueName: 
\"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212915 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212965 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.212990 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213006 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213267 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: 
\"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213284 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213317 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213353 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213386 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: 
I0313 11:51:29.213418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndkht\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213471 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213523 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c45j\" (UniqueName: \"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213601 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213620 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213648 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213683 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213754 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213778 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213850 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213969 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.213989 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214009 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214027 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 
11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214076 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214109 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214124 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214140 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214158 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214177 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214207 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214223 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214323 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.214363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215070 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 
13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215110 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.215123 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.216163 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-config\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218717 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-images\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.218998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-dir\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.220050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.220088 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.221282 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.221746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad2861b-4f40-4551-8aff-304359734792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.222470 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.223515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6db10103-96be-4420-b302-a7064e347f61-config\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.223795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-node-pullsecrets\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224446 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224486 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10ac507b-7307-4e09-ab72-b956d0139396-auth-proxy-config\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-serving-cert\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.224935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-service-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-audit\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6db10103-96be-4420-b302-a7064e347f61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225723 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-encryption-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad2861b-4f40-4551-8aff-304359734792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.225914 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e1f747d-78f3-4cbc-b313-eed531936c02-audit-dir\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-audit-policies\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226136 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226458 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-serving-cert\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226728 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.226878 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-serving-cert\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-config\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffb5553f-d2d5-4584-9bf8-7212a378f358-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.227981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/003e8201-4e67-4356-b0c1-8cc135451069-serving-cert\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228306 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10ac507b-7307-4e09-ab72-b956d0139396-machine-approver-tls\") 
pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228345 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228701 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.228744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-config\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.229606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3e1f747d-78f3-4cbc-b313-eed531936c02-image-import-ca\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230596 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/003e8201-4e67-4356-b0c1-8cc135451069-trusted-ca\") pod 
\"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230604 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.230871 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231560 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.231852 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232061 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232060 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.232553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/3e1f747d-78f3-4cbc-b313-eed531936c02-etcd-client\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.233695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234553 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-z9thp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.234998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-serving-cert\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.235308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.235503 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236134 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-encryption-config\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236709 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.236993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.237848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fca26784-7fdf-4923-bd07-35d182c2ad14-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238102 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238115 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.238737 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.239383 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.240874 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.243422 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.245522 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.247438 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.248023 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb5553f-d2d5-4584-9bf8-7212a378f358-etcd-client\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.248713 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.249746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.251095 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.252307 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.253664 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.255276 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.256482 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.257928 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.258225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.259710 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z9thp"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.261052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.262709 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 
11:51:29.263866 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.265279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.266488 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.268147 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.269266 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9g2bm"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.269836 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.271030 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.278813 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.298408 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315233 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmhx9\" (UniqueName: \"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315273 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315293 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315314 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315348 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315406 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315424 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315445 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315463 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315479 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315495 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315516 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315563 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315578 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndkht\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315725 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315742 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c45j\" (UniqueName: \"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") 
pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315808 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.315946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316022 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316048 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316106 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwg6b\" (UniqueName: \"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316129 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.316937 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-config\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.318799 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.321074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255ab2ef-dead-4148-bc85-2514618767b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.323302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/44f59229-dec6-4d9b-a63b-bd562b4523cf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.324919 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9025cb05-7c57-488b-a8cb-441552547aae-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.326167 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.328565 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.328595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/255ab2ef-dead-4148-bc85-2514618767b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.329014 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.338581 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.357881 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.378211 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.388883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4a3cd73-aa6c-4128-8a5f-561719e9b170-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.398316 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.407544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a3cd73-aa6c-4128-8a5f-561719e9b170-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.418190 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.438337 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.439747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2663fa-7df3-4801-be78-52517eb1f1cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.458934 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.478743 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.482429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c2663fa-7df3-4801-be78-52517eb1f1cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.499036 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.501014 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-service-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.518523 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.538961 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.547574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-config\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.558823 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.568115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-ca\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.578153 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.583441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-serving-cert\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.598430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.618231 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.638495 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.644085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10be2947-2e91-4a8e-b54e-69cdab598955-etcd-client\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.666600 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.672013 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/80d5bedc-a598-4779-be24-2d512ea7d148-trusted-ca\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.679177 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.699706 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.714142 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/80d5bedc-a598-4779-be24-2d512ea7d148-metrics-tls\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.718422 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.738765 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.759150 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.763742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.778750 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.798771 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.819693 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.839347 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.858447 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.868791 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9025cb05-7c57-488b-a8cb-441552547aae-config\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.878331 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.898879 4837 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.903257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-metrics-certs\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.919068 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.938155 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.950400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-default-certificate\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.959588 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.969922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3eaa54fb-8d70-463c-8388-9f8443a480ed-stats-auth\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.978603 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.981228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3eaa54fb-8d70-463c-8388-9f8443a480ed-service-ca-bundle\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:29 crc kubenswrapper[4837]: I0313 11:51:29.999206 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.018664 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.025727 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.039056 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.056670 4837 request.go:700] Waited for 1.013104437s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.057955 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.079036 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.099473 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.118539 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.139161 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.159297 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.170864 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5681b96-47c5-44f8-9e5d-671678930750-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.179768 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.198963 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.219654 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.237671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/00848ba6-522a-45c7-81bd-7ab287d77626-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.238797 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.259057 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.267103 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f59229-dec6-4d9b-a63b-bd562b4523cf-proxy-tls\") pod 
\"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.319111 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.339319 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.358911 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.379031 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.400305 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.419450 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.438404 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.468792 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.480009 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.500015 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.519457 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.539616 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.559417 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.579585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.598467 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.618728 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.639082 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.658657 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 11:51:30 crc 
kubenswrapper[4837]: I0313 11:51:30.678127 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.698856 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.718490 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.739343 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.758849 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.778622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.798897 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.819279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.861628 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks5xn\" (UniqueName: \"kubernetes.io/projected/f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0-kube-api-access-ks5xn\") pod \"openshift-config-operator-7777fb866f-f97pg\" (UID: \"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.876521 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdq6z\" (UniqueName: \"kubernetes.io/projected/fca26784-7fdf-4923-bd07-35d182c2ad14-kube-api-access-hdq6z\") pod \"cluster-samples-operator-665b6dd947-jpbgx\" (UID: \"fca26784-7fdf-4923-bd07-35d182c2ad14\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.899747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"console-f9d7485db-q2qpt\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.912591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74s9f\" (UniqueName: \"kubernetes.io/projected/d8974a7e-ac32-4644-b7ee-2d3908daf2fa-kube-api-access-74s9f\") pod \"authentication-operator-69f744f599-84ccm\" (UID: \"d8974a7e-ac32-4644-b7ee-2d3908daf2fa\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.933069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"controller-manager-879f6c89f-9dbhc\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.951683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffzw\" (UniqueName: \"kubernetes.io/projected/3e1f747d-78f3-4cbc-b313-eed531936c02-kube-api-access-vffzw\") pod \"apiserver-76f77b778f-qqkbm\" (UID: \"3e1f747d-78f3-4cbc-b313-eed531936c02\") " pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.973518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"route-controller-manager-6576b87f9c-qs2qs\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:30 crc kubenswrapper[4837]: I0313 11:51:30.991759 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dmk\" (UniqueName: \"kubernetes.io/projected/6db10103-96be-4420-b302-a7064e347f61-kube-api-access-q6dmk\") pod \"machine-api-operator-5694c8668f-vsp2m\" (UID: \"6db10103-96be-4420-b302-a7064e347f61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.003151 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.010283 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.017408 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blhw\" (UniqueName: \"kubernetes.io/projected/003e8201-4e67-4356-b0c1-8cc135451069-kube-api-access-9blhw\") pod \"console-operator-58897d9998-8dj7w\" (UID: \"003e8201-4e67-4356-b0c1-8cc135451069\") " pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.033836 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"oauth-openshift-558db77b4-8q6j6\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.054298 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdcp\" (UniqueName: \"kubernetes.io/projected/ffb5553f-d2d5-4584-9bf8-7212a378f358-kube-api-access-zsdcp\") pod \"apiserver-7bbb656c7d-472bb\" (UID: \"ffb5553f-d2d5-4584-9bf8-7212a378f358\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.055793 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.057378 4837 request.go:700] Waited for 1.830411174s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.072946 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.078264 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.079686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mclx9\" (UniqueName: \"kubernetes.io/projected/6ad2861b-4f40-4551-8aff-304359734792-kube-api-access-mclx9\") pod \"openshift-apiserver-operator-796bbdcf4f-dv9wr\" (UID: \"6ad2861b-4f40-4551-8aff-304359734792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.096733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnwqw\" (UniqueName: \"kubernetes.io/projected/10ac507b-7307-4e09-ab72-b956d0139396-kube-api-access-cnwqw\") pod \"machine-approver-56656f9798-dhrww\" (UID: \"10ac507b-7307-4e09-ab72-b956d0139396\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.099271 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.121373 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.123996 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.128012 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.138824 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.159666 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.170820 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.179862 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.200429 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.207294 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.226008 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.231913 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.239899 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.241118 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.258613 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.261691 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.281226 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.294518 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.315855 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n895\" (UniqueName: \"kubernetes.io/projected/f5681b96-47c5-44f8-9e5d-671678930750-kube-api-access-4n895\") pod \"package-server-manager-789f6589d5-bcfcc\" (UID: \"f5681b96-47c5-44f8-9e5d-671678930750\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.335346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmhx9\" (UniqueName: \"kubernetes.io/projected/e6a94afd-1f9a-4281-9d94-2fac3916f2c3-kube-api-access-kmhx9\") pod \"multus-admission-controller-857f4d67dd-wcfj4\" (UID: \"e6a94afd-1f9a-4281-9d94-2fac3916f2c3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.358979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec4b9459-d392-4fc5-9b6f-a87ca50e85b1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-l57bl\" (UID: \"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.378680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgwf\" (UniqueName: \"kubernetes.io/projected/3eaa54fb-8d70-463c-8388-9f8443a480ed-kube-api-access-fzgwf\") pod \"router-default-5444994796-9tkxg\" (UID: \"3eaa54fb-8d70-463c-8388-9f8443a480ed\") " pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.405394 4837 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.413662 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f97pg"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.416844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvjg\" (UniqueName: \"kubernetes.io/projected/2c2663fa-7df3-4801-be78-52517eb1f1cf-kube-api-access-gkvjg\") pod \"kube-storage-version-migrator-operator-b67b599dd-v526f\" (UID: \"2c2663fa-7df3-4801-be78-52517eb1f1cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.433342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblb5\" (UniqueName: \"kubernetes.io/projected/fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26-kube-api-access-zblb5\") pod \"migrator-59844c95c7-64xpb\" (UID: \"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.433363 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bb2n\" (UniqueName: \"kubernetes.io/projected/44f59229-dec6-4d9b-a63b-bd562b4523cf-kube-api-access-5bb2n\") pod \"machine-config-controller-84d6567774-hffr5\" (UID: \"44f59229-dec6-4d9b-a63b-bd562b4523cf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.446385 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-84ccm"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.450467 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.456017 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndkht\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-kube-api-access-ndkht\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.458897 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.462798 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vsp2m"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.476542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4a3cd73-aa6c-4128-8a5f-561719e9b170-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmg8q\" (UID: \"a4a3cd73-aa6c-4128-8a5f-561719e9b170\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.489746 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.493835 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fgk4q\" (UID: \"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.499412 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.518915 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.527004 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.527988 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9025cb05-7c57-488b-a8cb-441552547aae-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4rjht\" (UID: \"9025cb05-7c57-488b-a8cb-441552547aae\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.532601 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eaa54fb_8d70_463c_8388_9f8443a480ed.slice/crio-b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4 WatchSource:0}: Error finding container b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4: Status 404 returned error can't find the container with id b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4 Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.544979 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwg6b\" (UniqueName: \"kubernetes.io/projected/255ab2ef-dead-4148-bc85-2514618767b9-kube-api-access-pwg6b\") pod \"openshift-controller-manager-operator-756b6f6bc6-5nslp\" (UID: \"255ab2ef-dead-4148-bc85-2514618767b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.556556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.584080 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm2dw\" (UniqueName: \"kubernetes.io/projected/10be2947-2e91-4a8e-b54e-69cdab598955-kube-api-access-mm2dw\") pod \"etcd-operator-b45778765-l4rxn\" (UID: \"10be2947-2e91-4a8e-b54e-69cdab598955\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.586843 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.588911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqkbm"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.591671 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.596571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bhr\" (UniqueName: \"kubernetes.io/projected/80d5bedc-a598-4779-be24-2d512ea7d148-kube-api-access-r5bhr\") pod \"ingress-operator-5b745b69d9-2jcxn\" (UID: \"80d5bedc-a598-4779-be24-2d512ea7d148\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.618086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c45j\" (UniqueName: 
\"kubernetes.io/projected/00848ba6-522a-45c7-81bd-7ab287d77626-kube-api-access-7c45j\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrm5t\" (UID: \"00848ba6-522a-45c7-81bd-7ab287d77626\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpw7f\" (UniqueName: \"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658510 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658588 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658602 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45pg\" (UniqueName: \"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658691 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658717 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.658743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.659295 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.159284811 +0000 UTC m=+207.797551574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.689150 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.692860 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.718268 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.744512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.755537 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760243 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760418 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760442 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760533 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760607 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760674 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.760740 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 
13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.761737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-auth-proxy-config\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.761748 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.761892 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.261874203 +0000 UTC m=+207.900140976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762081 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762334 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b45pg\" (UniqueName: 
\"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762512 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762547 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762566 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762729 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.762849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-images\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.763441 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.763733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.764918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25rc\" (UniqueName: 
\"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765454 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765671 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.765935 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766051 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.766830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: 
\"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.767491 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.267474619 +0000 UTC m=+207.905741452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpw7f\" (UniqueName: \"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.767997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768057 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768079 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768717 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-proxy-tls\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd28x\" (UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.768997 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.769131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.769339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.771315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.771829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.772308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.772512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.774760 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.776262 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/416fd214-ef6d-45b4-bf11-a35c92909523-metrics-tls\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.788678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.788915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.799602 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45pg\" (UniqueName: \"kubernetes.io/projected/edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57-kube-api-access-b45pg\") pod \"machine-config-operator-74547568cd-69xj9\" (UID: \"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.805903 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.810303 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.811247 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.814685 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzdbz\" (UniqueName: \"kubernetes.io/projected/85ac6950-8b98-4d0c-8a2b-7eeeac8d1435-kube-api-access-jzdbz\") pod \"downloads-7954f5f757-8ktsx\" (UID: \"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435\") " pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.839522 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.843092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.843803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.849331 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.862307 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bc408a_bca6_42ff_8572_2ba9a3978682.slice/crio-79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174 WatchSource:0}: Error finding container 79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174: Status 404 returned error can't find the container with id 79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174 Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.864670 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.866562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wcfj4"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.869747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpw7f\" (UniqueName: \"kubernetes.io/projected/416fd214-ef6d-45b4-bf11-a35c92909523-kube-api-access-vpw7f\") pod \"dns-operator-744455d44c-8zzqp\" (UID: \"416fd214-ef6d-45b4-bf11-a35c92909523\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875440 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875583 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875633 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875667 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875681 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875711 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 
11:51:31.875737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875763 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875779 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875831 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875846 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875889 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875962 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.875996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876039 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876062 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876131 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876153 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd28x\" (UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: 
\"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.876315 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877141 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-registration-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877166 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d0005e35-a11c-4773-a0d1-94fa4aff8a14-tmpfs\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-csi-data-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.877247 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.377229685 +0000 UTC m=+208.015496448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-mountpoint-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.877467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-socket-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.880169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8bc71239-c925-4911-bfa5-e7a564dcd654-plugins-dir\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: W0313 11:51:31.881308 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a3cabe4_69ee_49f7_a783_e72ac1a56821.slice/crio-2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9 WatchSource:0}: Error finding container 2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9: Status 404 returned error can't find the container with id 2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9 Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-cabundle\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.881740 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-profile-collector-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-node-bootstrap-token\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-srv-cert\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.883619 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-webhook-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.884074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.884368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d0005e35-a11c-4773-a0d1-94fa4aff8a14-apiservice-cert\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.885211 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.885996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6085cb91-fec3-45bd-bfdc-a10e6043049f-metrics-tls\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/41e982da-ccd1-4b0c-9f0e-c220e06052a0-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886493 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b42d2c64-cd10-4923-aed0-dc586696da9a-certs\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.886468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e366a2cd-5dfa-45c9-b187-92772da0b827-srv-cert\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.896112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.935735 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"marketplace-operator-79b997595-8vgmn\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.937914 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8dj7w"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.948382 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6085cb91-fec3-45bd-bfdc-a10e6043049f-config-volume\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.950295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998432c5-238a-466a-a779-7d5126210706-config\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.951964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998432c5-238a-466a-a779-7d5126210706-serving-cert\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.952218 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-cert\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 
11:51:31.953091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"auto-csr-approver-29556710-lcprh\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.958056 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2960b8ba-5517-4915-b524-1f3f6d0f043c-signing-key\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.975532 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"collect-profiles-29556705-kllhr\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.978472 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:31 crc kubenswrapper[4837]: E0313 11:51:31.978961 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.478946229 +0000 UTC m=+208.117212992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.982140 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.984565 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9tkxg" event={"ID":"3eaa54fb-8d70-463c-8388-9f8443a480ed","Type":"ContainerStarted","Data":"b1b89859732d6e3f130db8770074382bf2ec9c2b4d0b2c135f5f19ccd80108b4"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.985859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"b8b9904f90ea9cab9b908c8386f85ff72414d4e5b210240fa04eb6214cfb4a49"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.991519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerStarted","Data":"48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078"} Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.992474 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f"] Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.994157 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.998175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxsc\" (UniqueName: \"kubernetes.io/projected/6085cb91-fec3-45bd-bfdc-a10e6043049f-kube-api-access-rrxsc\") pod \"dns-default-z9thp\" (UID: \"6085cb91-fec3-45bd-bfdc-a10e6043049f\") " pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:31 crc kubenswrapper[4837]: I0313 11:51:31.998318 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerStarted","Data":"6d6886f8a08a9d6498bf2731a6faf601bf8b43c566b4a0dbe066c5557e5e15e0"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.010263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" event={"ID":"d8974a7e-ac32-4644-b7ee-2d3908daf2fa","Type":"ContainerStarted","Data":"8bd1e25605040a3ded2b2fcdf3aba6d9b1057256f3fd36ccbc6a37df5954cca1"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.015414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"3724ccf51dcc3ce781ab7f660589de5a9700c6b6b97a2a3012f9580d869b7a9e"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.016975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerStarted","Data":"2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.024825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerStarted","Data":"79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.025461 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqzb\" (UniqueName: \"kubernetes.io/projected/2960b8ba-5517-4915-b524-1f3f6d0f043c-kube-api-access-fjqzb\") pod \"service-ca-9c57cc56f-xfcxm\" (UID: \"2960b8ba-5517-4915-b524-1f3f6d0f043c\") " pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.029985 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerStarted","Data":"e21d8c3cf5026263c4f5424f66828ba6aa5db357c326f97aa914eb0972b97eb0"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.035194 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llv9w\" (UniqueName: \"kubernetes.io/projected/998432c5-238a-466a-a779-7d5126210706-kube-api-access-llv9w\") pod \"service-ca-operator-777779d784-ng8zt\" (UID: \"998432c5-238a-466a-a779-7d5126210706\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.036342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" 
event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"4bd51b06146c5f096b2a54598de78033d1db9d6e3a286772a166210354044d28"} Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.041686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerStarted","Data":"08dd677887c4dde1b1c0188517f8488051597c9a84fd864cd57af21ec82e64e6"} Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.041766 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f59229_dec6_4d9b_a63b_bd562b4523cf.slice/crio-3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0 WatchSource:0}: Error finding container 3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0: Status 404 returned error can't find the container with id 3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.043343 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl"] Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.046128 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003e8201_4e67_4356_b0c1_8cc135451069.slice/crio-bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7 WatchSource:0}: Error finding container bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7: Status 404 returned error can't find the container with id bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.048614 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.055205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfq6d\" (UniqueName: \"kubernetes.io/projected/b42d2c64-cd10-4923-aed0-dc586696da9a-kube-api-access-nfq6d\") pod \"machine-config-server-9g2bm\" (UID: \"b42d2c64-cd10-4923-aed0-dc586696da9a\") " pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.057648 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2663fa_7df3_4801_be78_52517eb1f1cf.slice/crio-87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8 WatchSource:0}: Error finding container 87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8: Status 404 returned error can't find the container with id 87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8 Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.058519 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe42bd29_b8a7_4a9f_89e2_ab3b944d7c26.slice/crio-947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d WatchSource:0}: Error finding container 947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d: Status 404 returned error can't find the container with id 947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 
11:51:32.073109 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd28x\" (UniqueName: \"kubernetes.io/projected/41e982da-ccd1-4b0c-9f0e-c220e06052a0-kube-api-access-wd28x\") pod \"catalog-operator-68c6474976-4gsck\" (UID: \"41e982da-ccd1-4b0c-9f0e-c220e06052a0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.079610 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.080017 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.579997042 +0000 UTC m=+208.218263815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.081349 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.086305 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.091998 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zsz\" (UniqueName: \"kubernetes.io/projected/e366a2cd-5dfa-45c9-b187-92772da0b827-kube-api-access-d9zsz\") pod \"olm-operator-6b444d44fb-659h7\" (UID: \"e366a2cd-5dfa-45c9-b187-92772da0b827\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:32 crc kubenswrapper[4837]: W0313 11:51:32.092711 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255ab2ef_dead_4148_bc85_2514618767b9.slice/crio-eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923 WatchSource:0}: Error finding container eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923: Status 404 returned error can't find the container with id eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923 Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.124657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8kxg\" (UniqueName: \"kubernetes.io/projected/d0005e35-a11c-4773-a0d1-94fa4aff8a14-kube-api-access-d8kxg\") pod \"packageserver-d55dfcdfc-xhx6c\" (UID: \"d0005e35-a11c-4773-a0d1-94fa4aff8a14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.139121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s78rb\" (UniqueName: \"kubernetes.io/projected/90308f63-bacc-491b-9ce2-ffbb2eaaea1f-kube-api-access-s78rb\") pod \"ingress-canary-9hkj4\" (UID: \"90308f63-bacc-491b-9ce2-ffbb2eaaea1f\") " pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.147134 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.150858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.158232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmq8\" (UniqueName: \"kubernetes.io/projected/8bc71239-c925-4911-bfa5-e7a564dcd654-kube-api-access-dbmq8\") pod \"csi-hostpathplugin-84xjl\" (UID: \"8bc71239-c925-4911-bfa5-e7a564dcd654\") " pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.165202 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.172507 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.182568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.183345 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.683325797 +0000 UTC m=+208.321592560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.183576 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.195304 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.226344 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.234439 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.248032 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.273528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.281822 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9hkj4" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.284668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.285050 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.785028531 +0000 UTC m=+208.423295294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.287755 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9g2bm" Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.386426 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.389153 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.888702427 +0000 UTC m=+208.526969190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.487748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.488102 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:32.988071168 +0000 UTC m=+208.626337931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.531789 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.578813 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.580583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.589015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.589442 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.089428411 +0000 UTC m=+208.727695174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.598951 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.606602 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.660558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.683823 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l4rxn"] Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.692433 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.692688 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.192621461 +0000 UTC m=+208.830888224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.692813 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.693215 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.1932007 +0000 UTC m=+208.831467463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.793987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.794583 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.294567843 +0000 UTC m=+208.932834606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.908181 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:32 crc kubenswrapper[4837]: E0313 11:51:32.908517 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.40850217 +0000 UTC m=+209.046768933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:32 crc kubenswrapper[4837]: I0313 11:51:32.941095 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8ktsx"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.009728 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.010064 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.510039599 +0000 UTC m=+209.148306362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.105916 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ac6950_8b98_4d0c_8a2b_7eeeac8d1435.slice/crio-e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007 WatchSource:0}: Error finding container e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007: Status 404 returned error can't find the container with id e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106820 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106851 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" event={"ID":"9025cb05-7c57-488b-a8cb-441552547aae","Type":"ContainerStarted","Data":"fe0205be8dce14655377c945d336b37f32e1ec709e068892e4478220341b3086"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106890 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerStarted","Data":"0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d"} Mar 13 11:51:33 crc 
kubenswrapper[4837]: I0313 11:51:33.106899 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerStarted","Data":"7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106919 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerStarted","Data":"0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106927 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"d15d6148e3b4641e21c1219be8fd949b853388c951967107866968824b8b411d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.106936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" event={"ID":"d8974a7e-ac32-4644-b7ee-2d3908daf2fa","Type":"ContainerStarted","Data":"0f0ac7761242186c6486cc067d961428306b628dd37e60d608bfeec53567ce76"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.107781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9tkxg" event={"ID":"3eaa54fb-8d70-463c-8388-9f8443a480ed","Type":"ContainerStarted","Data":"8909e983ef2780c7ed608dd72d62ffc2711e88e6e546fc3ca22041d9c9d9f368"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.111240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.111725 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.611707772 +0000 UTC m=+209.249974535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118907 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8q6j6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118960 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118907 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.119046 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.118921 4837 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qs2qs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.119122 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.120137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"6ca9ef7b6e1aeeb5f213e3151501982326403f14efe8a0393d6e04b54a7e03b1"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.120189 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"947cbb59d184a3abaa202539dda8ef07f684be8aca67a5026c36c546ecb7c17d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125072 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-8dj7w" event={"ID":"003e8201-4e67-4356-b0c1-8cc135451069","Type":"ContainerStarted","Data":"24b5d222c04ee931204ab5357fdad08cdce8ed10f00338509d9dda5089a76343"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" event={"ID":"003e8201-4e67-4356-b0c1-8cc135451069","Type":"ContainerStarted","Data":"bac019194e3b35d01c48fc764e4ea75894e6d260733b68a6e762af1ad5c86dc7"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.125600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.127700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" event={"ID":"10be2947-2e91-4a8e-b54e-69cdab598955","Type":"ContainerStarted","Data":"27af68c1e8c6b927ac73e5302065bf60d6785c2334f48fb0b7e8b9eebff81e5d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.143438 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-8dj7w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.144806 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podUID="003e8201-4e67-4356-b0c1-8cc135451069" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.153341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" event={"ID":"255ab2ef-dead-4148-bc85-2514618767b9","Type":"ContainerStarted","Data":"d3ac728c6efb3f8a9d434e9b2db73dabc494a1a025502dc770d39636e7643e21"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.153385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" event={"ID":"255ab2ef-dead-4148-bc85-2514618767b9","Type":"ContainerStarted","Data":"eb83a01e844ced5ebb50f5c6f4f23872dfc2208d1286a3c2f7a421eb4b88a923"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.213829 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.213871 4837 generic.go:334] "Generic (PLEG): container finished" podID="3e1f747d-78f3-4cbc-b313-eed531936c02" containerID="cc397fa1bf18472d61483cbd90123ad619e890e1c799ebf20e335d2d2900efd2" exitCode=0 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.215097 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerDied","Data":"cc397fa1bf18472d61483cbd90123ad619e890e1c799ebf20e335d2d2900efd2"} Mar 13 
11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.215600 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.715576794 +0000 UTC m=+209.353843607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.222706 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" event={"ID":"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1","Type":"ContainerStarted","Data":"ac72c30ea5814f3e17ef0ea5566cc399d618ec18408f0a194a4e0e876289f7a3"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.225940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"a4e9e8e683453ec1ec97eac55a88db48d29f9aaf6a70433eb3f631db4c7b81d9"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.278260 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"87a66e2fda27d9418ac0248881f7d3de9402f85590f174cb3538e70e9955baa9"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.318422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.321080 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.821060636 +0000 UTC m=+209.459327509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.323418 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.327450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zzqp"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.331759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" event={"ID":"2c2663fa-7df3-4801-be78-52517eb1f1cf","Type":"ContainerStarted","Data":"27571341ae0d307e0f5fe511b98e899863223e7cd1f546bfe8d5b19e1c7422da"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.331795 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" event={"ID":"2c2663fa-7df3-4801-be78-52517eb1f1cf","Type":"ContainerStarted","Data":"87b29cac5b0d8102de47d1b2f0bf13cd354e9c6469e4cd5a1e05a6523362cce8"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.349851 4837 generic.go:334] "Generic (PLEG): container finished" podID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerID="ca7706ee51695e703b71f0dfa955c9b51c9bd2ac8cba2d6910d4014415da7692" exitCode=0 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.349981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerDied","Data":"ca7706ee51695e703b71f0dfa955c9b51c9bd2ac8cba2d6910d4014415da7692"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.359877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"236230e9d88a7986fa545c821a13267c56c256d3c89adf216bffb2cfb73ae7c2"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.361617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"3ef1ad2cfd596ea09a0783d40ce9413e65a0ca7227111a529229125c392c9dd0"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.362954 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" event={"ID":"00848ba6-522a-45c7-81bd-7ab287d77626","Type":"ContainerStarted","Data":"bb3553ad20bce98adabe99a30a48b45691b06798afd9f7e7897e02cca605715d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.363696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" 
event={"ID":"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd","Type":"ContainerStarted","Data":"b69b65dfcd7e526c6965c8376e21c5fdea1ff5f7cf09fa0110e114348954b91d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.365894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"34242361e539e0843f07ff6be10c070f33367f97039c925382d891dae818df9a"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.373772 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" event={"ID":"6ad2861b-4f40-4551-8aff-304359734792","Type":"ContainerStarted","Data":"ecb6f4e687ec2286b4a0231baea09dabeeb624191aa322a66433bce22ed1353d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.373819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" event={"ID":"6ad2861b-4f40-4551-8aff-304359734792","Type":"ContainerStarted","Data":"21bdf6a8e9acd59896f8179096077562702534a4681d2f2e339180db9df351b5"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.380438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"bb2d7b77a707e2a0fceb7edc70566799a53b9e335b57fb7bcf31960f63eba7da"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.391347 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"a0ca8c97b2911c4bf447e84741f129e0c258bce5a17e6f8304df1aff10c8aa04"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.391394 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"bf33e77fd4365841b90221599801ebed872e8e5e376cd3f2f3c89ea2cdd90c87"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.392809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9g2bm" event={"ID":"b42d2c64-cd10-4923-aed0-dc586696da9a","Type":"ContainerStarted","Data":"25d408737e876ad711dada87c79742e26473e8799fa23d15fc77576a44b9ba2d"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.394631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerStarted","Data":"c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112"} Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.400760 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" event={"ID":"a4a3cd73-aa6c-4128-8a5f-561719e9b170","Type":"ContainerStarted","Data":"c5767e3fcdbc5dc97c0a042c7acee915b4df773e7712c84a1e6a7c143810b3b2"} Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.420278 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e982da_ccd1_4b0c_9f0e_c220e06052a0.slice/crio-de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2 WatchSource:0}: Error finding container de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2: Status 404 returned error can't find the container with id de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.420742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.421854 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.921821741 +0000 UTC m=+209.560088504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.427387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.431301 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:33.931278407 +0000 UTC m=+209.569545170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.492292 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.497560 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:33 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:33 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:33 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.497624 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.509967 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.512054 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.531183 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.532060 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.032044682 +0000 UTC m=+209.670311445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.533841 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podStartSLOduration=151.533824577 podStartE2EDuration="2m31.533824577s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.528049266 +0000 UTC m=+209.166316029" watchObservedRunningTime="2026-03-13 11:51:33.533824577 +0000 UTC m=+209.172091340" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.563179 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-84ccm" podStartSLOduration=152.563158065 podStartE2EDuration="2m32.563158065s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.54830319 +0000 UTC m=+209.186569953" watchObservedRunningTime="2026-03-13 11:51:33.563158065 +0000 UTC m=+209.201424838" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.599581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xfcxm"] Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.607788 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode366a2cd_5dfa_45c9_b187_92772da0b827.slice/crio-92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab WatchSource:0}: Error finding container 92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab: Status 404 returned error can't find the container with id 92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.608409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.624169 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.634392 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.634795 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:34.134784768 +0000 UTC m=+209.773051531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.635380 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5nslp" podStartSLOduration=151.635370786 podStartE2EDuration="2m31.635370786s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.633838778 +0000 UTC m=+209.272105541" watchObservedRunningTime="2026-03-13 11:51:33.635370786 +0000 UTC m=+209.273637549" Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.688414 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2960b8ba_5517_4915_b524_1f3f6d0f043c.slice/crio-1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9 WatchSource:0}: Error finding container 1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9: Status 404 returned error can't find the container with id 1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9 Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.691175 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831db5b2_5229_4b52_8783_f99c640ba856.slice/crio-3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283 WatchSource:0}: Error finding container 3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283: Status 404 returned error can't find the container with id 3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.701747 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.735344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.735554 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.235519391 +0000 UTC m=+209.873786164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.735776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.736169 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.236153831 +0000 UTC m=+209.874420594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.753977 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podStartSLOduration=152.753951888 podStartE2EDuration="2m32.753951888s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.723770043 +0000 UTC m=+209.362036826" watchObservedRunningTime="2026-03-13 11:51:33.753951888 +0000 UTC m=+209.392218651" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.758882 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podStartSLOduration=151.758861752 podStartE2EDuration="2m31.758861752s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.758551823 +0000 UTC m=+209.396818606" watchObservedRunningTime="2026-03-13 11:51:33.758861752 +0000 UTC m=+209.397128515" Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.791710 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb85cad_ec2d_4ada_bd68_55937d96a779.slice/crio-32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3 WatchSource:0}: Error finding container 32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3: Status 404 returned error can't find the container with id 32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3 Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.794940 4837 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.800306 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podStartSLOduration=152.800288269 podStartE2EDuration="2m32.800288269s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.797460691 +0000 UTC m=+209.435727454" watchObservedRunningTime="2026-03-13 11:51:33.800288269 +0000 UTC m=+209.438555052" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.812407 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9hkj4"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.820326 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-84xjl"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.824437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-z9thp"] Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.838724 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.839135 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.339111124 +0000 UTC m=+209.977377887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.840451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.841010 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.340997873 +0000 UTC m=+209.979264636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: W0313 11:51:33.869690 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6085cb91_fec3_45bd_bfdc_a10e6043049f.slice/crio-aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c WatchSource:0}: Error finding container aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c: Status 404 returned error can't find the container with id aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.874496 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.877817 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9tkxg" podStartSLOduration=151.877802656 podStartE2EDuration="2m31.877802656s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.877605189 +0000 UTC m=+209.515871972" watchObservedRunningTime="2026-03-13 11:51:33.877802656 +0000 UTC m=+209.516069419" Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.941938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:33 crc kubenswrapper[4837]: E0313 11:51:33.942199 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.44216493 +0000 UTC m=+210.080431693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:33 crc kubenswrapper[4837]: I0313 11:51:33.955970 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q2qpt" podStartSLOduration=151.955945972 podStartE2EDuration="2m31.955945972s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:33.95044979 +0000 UTC m=+209.588716563" watchObservedRunningTime="2026-03-13 11:51:33.955945972 +0000 UTC m=+209.594212735" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.047186 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.049687 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.549672937 +0000 UTC m=+210.187939690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.051520 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" podStartSLOduration=152.051499453 podStartE2EDuration="2m32.051499453s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.045843626 +0000 UTC m=+209.684110389" watchObservedRunningTime="2026-03-13 11:51:34.051499453 +0000 UTC m=+209.689766216" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.121334 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dv9wr" podStartSLOduration=153.121315569 podStartE2EDuration="2m33.121315569s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.093817568 +0000 UTC m=+209.732084331" watchObservedRunningTime="2026-03-13 11:51:34.121315569 +0000 UTC m=+209.759582332" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.122799 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v526f" podStartSLOduration=152.122786575 podStartE2EDuration="2m32.122786575s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.119401469 +0000 UTC m=+209.757668232" watchObservedRunningTime="2026-03-13 11:51:34.122786575 +0000 UTC m=+209.761053338" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.154351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.154817 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.654795767 +0000 UTC m=+210.293062530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.256222 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.256780 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.75676468 +0000 UTC m=+210.395031453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.357088 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.357453 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.857439502 +0000 UTC m=+210.495706265 (durationBeforeRetry 500ms). 
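The E-level nestedpendingoperations.go entries show how the failing operation is throttled rather than retried in a hot loop: each failure records a "no retries permitted until" deadline 500ms in the future (durationBeforeRetry 500ms throughout this window), and the reconciler skips the volume until that deadline passes, which is why the same pair of errors repeats roughly twice per second. A rough sketch of such a per-volume retry gate, under assumed names (retryGate) and with the fixed delay seen here, not the kubelet's actual implementation:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryGate keeps a per-volume "not before" deadline, matching the
// "No retries permitted until ..." entries above. Illustrative only.
type retryGate struct {
	notBefore map[string]time.Time
	backoff   time.Duration // the log shows 500ms between attempts
}

var errTooSoon = errors.New("no retries permitted yet")

func (g *retryGate) run(volume string, op func() error) error {
	if t, ok := g.notBefore[volume]; ok && time.Now().Before(t) {
		return errTooSoon // reconciler will pick the volume up on a later pass
	}
	if err := op(); err != nil {
		g.notBefore[volume] = time.Now().Add(g.backoff)
		return err
	}
	delete(g.notBefore, volume)
	return nil
}

func main() {
	g := &retryGate{notBefore: map[string]time.Time{}, backoff: 500 * time.Millisecond}
	vol := "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	mount := func() error { return errors.New("driver not registered") }

	fmt.Println(g.run(vol, mount)) // fails and arms the 500ms deadline
	fmt.Println(g.run(vol, mount)) // rejected: too soon to retry
	time.Sleep(600 * time.Millisecond)
	fmt.Println(g.run(vol, mount)) // retried after the deadline; fails again until the driver registers
}
```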
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.415332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" event={"ID":"a4a3cd73-aa6c-4128-8a5f-561719e9b170","Type":"ContainerStarted","Data":"336c92baa53a146263254fd491124e611101140747c03155c6b5a6a9c70b55c2"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.422748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" event={"ID":"998432c5-238a-466a-a779-7d5126210706","Type":"ContainerStarted","Data":"7a69b7c6b089175a7cc7fc1cba2ac9b38104e05673fb7ba5b0c410bd95c86b31"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.441441 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" event={"ID":"fe42bd29-b8a7-4a9f-89e2-ab3b944d7c26","Type":"ContainerStarted","Data":"04c8c046f4062a12915ce1bc97ea7ac2245aedb416cfc34c07574367a17e5c75"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.442681 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmg8q" podStartSLOduration=152.442662929 podStartE2EDuration="2m32.442662929s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.441160612 +0000 UTC m=+210.079427385" watchObservedRunningTime="2026-03-13 11:51:34.442662929 +0000 UTC m=+210.080929692" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.460244 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.460851 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:34.960815008 +0000 UTC m=+210.599081771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.468868 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-64xpb" podStartSLOduration=152.468848939 podStartE2EDuration="2m32.468848939s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.467697583 +0000 UTC m=+210.105964346" watchObservedRunningTime="2026-03-13 11:51:34.468848939 +0000 UTC m=+210.107115712" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.475473 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"93a423c2f3f7c65543caac4a0f22a811ce2eabd4f3aefb38b12ee8b804d63fae"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.482821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"53d534af6f8df34ececfb1a8a0e6c5ad8f272dcb9c009a461b5b14acbfd167b8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.482874 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" event={"ID":"f5681b96-47c5-44f8-9e5d-671678930750","Type":"ContainerStarted","Data":"6f263bafe32aba6747375cfb633f62ecda193eb68ec17190dd9cca7b969ae95a"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.483905 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.491058 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" event={"ID":"f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0","Type":"ContainerStarted","Data":"9d9e796c53101ac91bc5573107aa81ef153174f5c3a3c3ef62f853513d212d80"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.500116 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.503977 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:34 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:34 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:34 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504037 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" 
podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504379 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerStarted","Data":"965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.504419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerStarted","Data":"3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.510909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerStarted","Data":"32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.512831 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" podStartSLOduration=152.512809675 podStartE2EDuration="2m32.512809675s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.51134662 +0000 UTC m=+210.149613403" watchObservedRunningTime="2026-03-13 11:51:34.512809675 +0000 UTC m=+210.151076438" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" event={"ID":"d0005e35-a11c-4773-a0d1-94fa4aff8a14","Type":"ContainerStarted","Data":"81c8f78518697dd119d66780b63d9f72316e68c2f8e4ec6dd363b91e11730a57"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528775 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" event={"ID":"d0005e35-a11c-4773-a0d1-94fa4aff8a14","Type":"ContainerStarted","Data":"1600b61f7d9a501c5cb5dec3d449565dc9734994d5148c31b0f65a016a53ce24"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.528965 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.531003 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xhx6c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.531088 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podUID="d0005e35-a11c-4773-a0d1-94fa4aff8a14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.537423 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" podStartSLOduration=153.53726176 podStartE2EDuration="2m33.53726176s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.531217402 +0000 UTC m=+210.169484185" watchObservedRunningTime="2026-03-13 11:51:34.53726176 +0000 UTC m=+210.175528513" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.557135 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podStartSLOduration=153.557096251 podStartE2EDuration="2m33.557096251s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.555984587 +0000 UTC m=+210.194251350" watchObservedRunningTime="2026-03-13 11:51:34.557096251 +0000 UTC m=+210.195363004" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.559588 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" event={"ID":"10ac507b-7307-4e09-ab72-b956d0139396","Type":"ContainerStarted","Data":"e62899fe641e7552245bdb1105a06db5016cb36ce949fad6221d2c72a4fdec51"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.560796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.562839 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.06281122 +0000 UTC m=+210.701078153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.563825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" event={"ID":"e366a2cd-5dfa-45c9-b187-92772da0b827","Type":"ContainerStarted","Data":"2bf3a279693bfd2239b0811b510f18eec4acf69529f2534645ac054c12ce5663"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.563900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" event={"ID":"e366a2cd-5dfa-45c9-b187-92772da0b827","Type":"ContainerStarted","Data":"92b7274c7455968a613e949633813e0ece8512f287e59aea006ff7883bef05ab"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.565137 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.571506 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vsp2m" event={"ID":"6db10103-96be-4420-b302-a7064e347f61","Type":"ContainerStarted","Data":"ad11424fde61443cfda2afa459aaadcdeb2d287845e06731b5a12889a60c35c7"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.573745 4837 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-659h7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.573824 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podUID="e366a2cd-5dfa-45c9-b187-92772da0b827" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.577531 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podStartSLOduration=152.577507441 podStartE2EDuration="2m32.577507441s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.576160009 +0000 UTC m=+210.214426772" watchObservedRunningTime="2026-03-13 11:51:34.577507441 +0000 UTC m=+210.215774204" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.593162 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" event={"ID":"fca26784-7fdf-4923-bd07-35d182c2ad14","Type":"ContainerStarted","Data":"5607a7fb040fc09d5c18959f6f3f38a09ca7b1955d2c087bf5a6429e6ba86758"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.620975 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"67246c316b4f89ce453ed8c939d7b6959e5673d7d07e1b3189a0364926d8c36c"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.621034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" event={"ID":"80d5bedc-a598-4779-be24-2d512ea7d148","Type":"ContainerStarted","Data":"5cf6fb6c12775c391cb2badc48b232d0fec1d099936960e591a7481d2edd85ca"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.623573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerStarted","Data":"960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.630260 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podStartSLOduration=152.630239271 podStartE2EDuration="2m32.630239271s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.624678107 +0000 UTC m=+210.262944890" watchObservedRunningTime="2026-03-13 11:51:34.630239271 +0000 UTC m=+210.268506034" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.630746 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhrww" podStartSLOduration=153.630740237 podStartE2EDuration="2m33.630740237s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.603856136 +0000 UTC m=+210.242122899" watchObservedRunningTime="2026-03-13 11:51:34.630740237 +0000 UTC m=+210.269006990" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.641364 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"164f9219a9b284091a35c7220447c2c7b0bed7f5727c7a47455b22924bfd7017"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.641416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" event={"ID":"edf3b6c4-d4e3-4ec6-8fdf-ff01abe23e57","Type":"ContainerStarted","Data":"26ab5d52a299f8f16150b097ca72ca2f2e9edeabd8750e2fd4e382c8b4e868a8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.645057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"8078318f2e84a807e81a2aab4ef9d643f99f9cbb8a870017c43050909154a8c8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.645108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" event={"ID":"44f59229-dec6-4d9b-a63b-bd562b4523cf","Type":"ContainerStarted","Data":"7b919fbe145de00ddd31c2e9c36d139a2ba1f597228ad3aacec8b12b44c0bd55"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 
11:51:34.650489 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" event={"ID":"10be2947-2e91-4a8e-b54e-69cdab598955","Type":"ContainerStarted","Data":"844116c251951dc948531c150544bded3b51d4b0a39ec1479d3741e54fcffa22"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656026 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"5eb9cb8c84060b639aeb979d7a717fec41c33810120564ed7c88b7d4c8a36b76"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"e755c6302f2eb2419c8060998ee9930c1d704371a17e3337b1dbc4d6d0f0fbf2"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.656211 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2jcxn" podStartSLOduration=152.656192674 podStartE2EDuration="2m32.656192674s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.654504482 +0000 UTC m=+210.292771245" watchObservedRunningTime="2026-03-13 11:51:34.656192674 +0000 UTC m=+210.294459437" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.663801 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.664779 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.164767553 +0000 UTC m=+210.803034316 (durationBeforeRetry 500ms). 
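The hostpath-provisioner/csi-hostpathplugin-84xjl pod whose ContainerStarted event appears a few entries above is what eventually clears these mount failures: once the driver registers itself with the kubelet, kubevirt.io.hostpath-provisioner shows up in the registered-driver list and the pending MountDevice retries can succeed. As a node-local spot check, the sketch below lists the kubelet's plugin-registration directory; the default path /var/lib/kubelet/plugins_registry and the assumption that each registered driver leaves a socket there are assumptions, not taken from this log.

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Default kubelet plugin-registration directory (adjust for a
	// non-standard --root-dir). Registered drivers leave their
	// registration sockets here.
	const regDir = "/var/lib/kubelet/plugins_registry"

	entries, err := os.ReadDir(regDir)
	if err != nil {
		fmt.Println("cannot read", regDir+":", err)
		return
	}
	for _, e := range entries {
		// Expect an entry mentioning kubevirt.io.hostpath-provisioner
		// once registration has completed.
		fmt.Println(e.Name())
	}
}
```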
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.665035 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"499553b71cc7f3fa078528578afd4bc0d7fd23462f79f4bcc3fbaa571428442f"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.665088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" event={"ID":"e6a94afd-1f9a-4281-9d94-2fac3916f2c3","Type":"ContainerStarted","Data":"19a484d698e8d3794e1c4a936f5217201dc3ab68057937694742bd30ea81f214"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.699548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ktsx" event={"ID":"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435","Type":"ContainerStarted","Data":"aea88127f304b60e92e6e3bfd6b308c34d21f2a1163ada78a267b7d02277f97d"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.699615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8ktsx" event={"ID":"85ac6950-8b98-4d0c-8a2b-7eeeac8d1435","Type":"ContainerStarted","Data":"e010eb156dd5e54ff5b67933f30569e7dfedf9e8ebf856adbf791bedb5ca7007"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.700745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.711091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9g2bm" event={"ID":"b42d2c64-cd10-4923-aed0-dc586696da9a","Type":"ContainerStarted","Data":"13687571c1d08b467659e2286548ed0f122cf7da33338d2df258402f1288580d"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.717613 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jpbgx" podStartSLOduration=153.717590756 podStartE2EDuration="2m33.717590756s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.715043867 +0000 UTC m=+210.353310630" watchObservedRunningTime="2026-03-13 11:51:34.717590756 +0000 UTC m=+210.355857519" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.723991 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.724057 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial 
tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.741857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" event={"ID":"a5abeecf-9533-4cd9-8ce3-29bb6d8a00bd","Type":"ContainerStarted","Data":"9051790ee9843afb035fb7211a009d5253125f571a3e0be3940d0df74643b8c8"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.766453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.767995 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.267967233 +0000 UTC m=+210.906234006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.782137 4837 generic.go:334] "Generic (PLEG): container finished" podID="ffb5553f-d2d5-4584-9bf8-7212a378f358" containerID="f370af6d9bb9be9e1462cd46c0419f4c6f0c3a54cbe69a7101da09b322008a64" exitCode=0 Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.782258 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerDied","Data":"f370af6d9bb9be9e1462cd46c0419f4c6f0c3a54cbe69a7101da09b322008a64"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.792783 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" event={"ID":"41e982da-ccd1-4b0c-9f0e-c220e06052a0","Type":"ContainerStarted","Data":"70676082def244677edb5580775204c77414b6bc522caa2f0805cd1405f61587"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.792837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" event={"ID":"41e982da-ccd1-4b0c-9f0e-c220e06052a0","Type":"ContainerStarted","Data":"de0a9c7b74e03c5422dfa011bacd29880935071aff96993b9c3cf9d85694e4d2"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.794147 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.802540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hkj4" event={"ID":"90308f63-bacc-491b-9ce2-ffbb2eaaea1f","Type":"ContainerStarted","Data":"ec5b1f024551089ef0a7ec39c81b1af86808e9f3a3709026bb866788e9c3043e"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.816831 4837 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-4gsck container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.816910 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" podUID="41e982da-ccd1-4b0c-9f0e-c220e06052a0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.817900 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l4rxn" podStartSLOduration=152.817885376 podStartE2EDuration="2m32.817885376s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.81705036 +0000 UTC m=+210.455317123" watchObservedRunningTime="2026-03-13 11:51:34.817885376 +0000 UTC m=+210.456152139" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.818882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"aa9b8655d804dd95c7ef91fad04e5b946fd2097131e680a82470cf7f2089113c"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.819309 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9g2bm" podStartSLOduration=5.819298441 podStartE2EDuration="5.819298441s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.753709227 +0000 UTC m=+210.391976020" watchObservedRunningTime="2026-03-13 11:51:34.819298441 +0000 UTC m=+210.457565224" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.830774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" event={"ID":"9025cb05-7c57-488b-a8cb-441552547aae","Type":"ContainerStarted","Data":"6221e47c7d92fba80529d8c67748e95db883e3bfd59b0c1395e15b0b1c63df79"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.862815 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-69xj9" podStartSLOduration=152.862790422 podStartE2EDuration="2m32.862790422s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.860047376 +0000 UTC m=+210.498314139" watchObservedRunningTime="2026-03-13 11:51:34.862790422 +0000 UTC m=+210.501057185" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.863155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" event={"ID":"ec4b9459-d392-4fc5-9b6f-a87ca50e85b1","Type":"ContainerStarted","Data":"afa74a59bf362e1526eaf8019370b76f31cc88294794a75fa2e16f11441e11d9"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.868532 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.887699 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.387680911 +0000 UTC m=+211.025947664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.909615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" event={"ID":"2960b8ba-5517-4915-b524-1f3f6d0f043c","Type":"ContainerStarted","Data":"1b7479fdaddb40cc2b2b5d9856e0bfd15305f77890603dcca573328b382a1ea9"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.940754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" event={"ID":"00848ba6-522a-45c7-81bd-7ab287d77626","Type":"ContainerStarted","Data":"c171f1f4f12cca2eb0fc64dbb462cae8fdfab2815e450b48a43b5af2e0b3f556"} Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.947598 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.947675 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949444 4837 patch_prober.go:28] interesting pod/console-operator-58897d9998-8dj7w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949524 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" podUID="003e8201-4e67-4356-b0c1-8cc135451069" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949598 4837 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8q6j6 container/oauth-openshift 
namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.949802 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.956984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.960985 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8ktsx" podStartSLOduration=152.960966845 podStartE2EDuration="2m32.960966845s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.927503028 +0000 UTC m=+210.565769801" watchObservedRunningTime="2026-03-13 11:51:34.960966845 +0000 UTC m=+210.599233608" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.964846 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wcfj4" podStartSLOduration=152.964821206 podStartE2EDuration="2m32.964821206s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.960345266 +0000 UTC m=+210.598612029" watchObservedRunningTime="2026-03-13 11:51:34.964821206 +0000 UTC m=+210.603087969" Mar 13 11:51:34 crc kubenswrapper[4837]: I0313 11:51:34.975214 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:34 crc kubenswrapper[4837]: E0313 11:51:34.976553 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.476532673 +0000 UTC m=+211.114799496 (durationBeforeRetry 500ms). 
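Two kinds of probe failure are interleaved in this window. The router's startup probe receives an HTTP 500 because the router's own health endpoint reports [-]backend-http and [-]has-synced as failed, while the readiness probes for packageserver, olm-operator, downloads, catalog-operator, controller-manager, console-operator and oauth-openshift fail with "connection refused" because those containers have only just started and are not listening yet. Both reduce to the same kubelet-side check: perform an HTTP GET and treat a transport error or a status code outside 200-399 as a failure. A generic sketch of that check (not the kubelet's prober package; the endpoints in main are placeholders taken from the probe output above):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// probe performs one HTTP probe the way the entries above describe it:
// a transport error ("connect: connection refused") or a status code
// outside 200-399 counts as a failure.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %v", err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024))
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body=%s", resp.StatusCode, body)
	}
	return nil
}

func main() {
	// Addresses copied from the probe failures above; unreachable from
	// outside the node, so both calls will report transport errors here.
	for _, url := range []string{"http://10.217.0.15:8080/", "https://10.217.0.40:5443/healthz"} {
		if err := probe(url); err != nil {
			fmt.Println(url, "->", err)
		}
	}
}
```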
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:34.999066 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hffr5" podStartSLOduration=152.999043588 podStartE2EDuration="2m32.999043588s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:34.996595871 +0000 UTC m=+210.634862644" watchObservedRunningTime="2026-03-13 11:51:34.999043588 +0000 UTC m=+210.637310341" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.073988 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-l57bl" podStartSLOduration=153.073968753 podStartE2EDuration="2m33.073968753s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.072808657 +0000 UTC m=+210.711075430" watchObservedRunningTime="2026-03-13 11:51:35.073968753 +0000 UTC m=+210.712235516" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.078046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.078405 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.578390731 +0000 UTC m=+211.216657504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.100767 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrm5t" podStartSLOduration=153.100751111 podStartE2EDuration="2m33.100751111s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.100247206 +0000 UTC m=+210.738513969" watchObservedRunningTime="2026-03-13 11:51:35.100751111 +0000 UTC m=+210.739017875" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.129298 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" podStartSLOduration=153.129280065 podStartE2EDuration="2m33.129280065s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.128250942 +0000 UTC m=+210.766517715" watchObservedRunningTime="2026-03-13 11:51:35.129280065 +0000 UTC m=+210.767546828" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.159787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fgk4q" podStartSLOduration=153.159769499 podStartE2EDuration="2m33.159769499s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.156012111 +0000 UTC m=+210.794278884" watchObservedRunningTime="2026-03-13 11:51:35.159769499 +0000 UTC m=+210.798036272" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.180735 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.180853 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.680832979 +0000 UTC m=+211.319099752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.181277 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.181711 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.681701726 +0000 UTC m=+211.319968489 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.210413 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9hkj4" podStartSLOduration=6.210394764 podStartE2EDuration="6.210394764s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.17579164 +0000 UTC m=+210.814058403" watchObservedRunningTime="2026-03-13 11:51:35.210394764 +0000 UTC m=+210.848661527" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.252673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4rjht" podStartSLOduration=153.252656097 podStartE2EDuration="2m33.252656097s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.221323866 +0000 UTC m=+210.859590639" watchObservedRunningTime="2026-03-13 11:51:35.252656097 +0000 UTC m=+210.890922860" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.287295 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.287657 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.787628992 +0000 UTC m=+211.425895755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.289567 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" podStartSLOduration=153.289542462 podStartE2EDuration="2m33.289542462s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:35.253976788 +0000 UTC m=+210.892243571" watchObservedRunningTime="2026-03-13 11:51:35.289542462 +0000 UTC m=+210.927809225" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.392330 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.392793 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:35.892778474 +0000 UTC m=+211.531045237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.488145 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.488434 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.494424 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.505541 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.005522264 +0000 UTC m=+211.643789027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.510829 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:35 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:35 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:35 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.510912 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.606871 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.607257 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.607552 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.107538967 +0000 UTC m=+211.745805730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.711475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.712041 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.212021358 +0000 UTC m=+211.850288121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.712318 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.712563 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.212557255 +0000 UTC m=+211.850824008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.815233 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.815538 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.315520199 +0000 UTC m=+211.953786972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.815694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.815955 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.315946642 +0000 UTC m=+211.954213485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.916892 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:35 crc kubenswrapper[4837]: E0313 11:51:35.917378 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.417362486 +0000 UTC m=+212.055629249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:35 crc kubenswrapper[4837]: I0313 11:51:35.995847 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" event={"ID":"998432c5-238a-466a-a779-7d5126210706","Type":"ContainerStarted","Data":"17bcf3017106794b154977fad9d2c812796ccc0d0ef29808bf462a133588ee4f"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.024694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.025164 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.525145291 +0000 UTC m=+212.163412144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.045764 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"055d1ece3e453d5cd8ff1c8063ecb17ee0490ae16e5ed5fdb15f70404bb4569d"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.064522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9hkj4" event={"ID":"90308f63-bacc-491b-9ce2-ffbb2eaaea1f","Type":"ContainerStarted","Data":"4dc188b2c41b6fea37bc31958d827dc56918d3204bb89c56485df4a2a5ee352a"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.091243 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xfcxm" event={"ID":"2960b8ba-5517-4915-b524-1f3f6d0f043c","Type":"ContainerStarted","Data":"43fc2231f05867d05d847de1bd4090131ff52020558ef46e3913e63609a95979"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.100006 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" event={"ID":"416fd214-ef6d-45b4-bf11-a35c92909523","Type":"ContainerStarted","Data":"251642b3e74397ace0a9f24d9c53652b9f098294fc8b460805946e095b03ca59"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.112383 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-ng8zt" podStartSLOduration=154.112366372 podStartE2EDuration="2m34.112366372s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.112168716 +0000 UTC m=+211.750435479" watchObservedRunningTime="2026-03-13 11:51:36.112366372 +0000 UTC m=+211.750633135" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.117678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerStarted","Data":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.117904 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.119215 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vgmn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.119259 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.126123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.127010 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.626974779 +0000 UTC m=+212.265241542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.127725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"b2855d588ed711a2a0163b9bf580169f1f7ec427da32756423d44ca38e6cb5be"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.127771 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" event={"ID":"3e1f747d-78f3-4cbc-b313-eed531936c02","Type":"ContainerStarted","Data":"cc407b0d1ace08a36e1a88fb7b0359d00f5ca27055c133bc6c09e4fad8dddebf"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.138803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"57e34defc005454facec5cabd8aafb289d26d833eaed6b9d8732c0557b13c1f5"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.138850 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-z9thp" event={"ID":"6085cb91-fec3-45bd-bfdc-a10e6043049f","Type":"ContainerStarted","Data":"87a261aea7c43882c29f16625a2b4dd46997bbc086bc97384be02d61ee5a8595"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.139409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.142013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" event={"ID":"ffb5553f-d2d5-4584-9bf8-7212a378f358","Type":"ContainerStarted","Data":"4339d1e1fecf820037c6a5425b00f03c18da3f1bea62317e129879816d68eb07"} Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149323 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149376 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149460 4837 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-659h7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149481 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" podUID="e366a2cd-5dfa-45c9-b187-92772da0b827" containerName="olm-operator" 
probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149524 4837 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-xhx6c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.149577 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" podUID="d0005e35-a11c-4773-a0d1-94fa4aff8a14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.160364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4gsck" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.229999 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.233558 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.733547596 +0000 UTC m=+212.371814359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.255225 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8zzqp" podStartSLOduration=154.255202053 podStartE2EDuration="2m34.255202053s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.208658156 +0000 UTC m=+211.846924929" watchObservedRunningTime="2026-03-13 11:51:36.255202053 +0000 UTC m=+211.893468836" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.255422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" podStartSLOduration=155.25541633 podStartE2EDuration="2m35.25541633s" podCreationTimestamp="2026-03-13 11:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.249541026 +0000 UTC m=+211.887807809" watchObservedRunningTime="2026-03-13 11:51:36.25541633 +0000 UTC m=+211.893683083" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.277102 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42380: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.337900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.344744 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.838582344 +0000 UTC m=+212.476849107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.361824 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-z9thp" podStartSLOduration=7.361808031 podStartE2EDuration="7.361808031s" podCreationTimestamp="2026-03-13 11:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.325194735 +0000 UTC m=+211.963461508" watchObservedRunningTime="2026-03-13 11:51:36.361808031 +0000 UTC m=+212.000074794" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.402707 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podStartSLOduration=154.402689471 podStartE2EDuration="2m34.402689471s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.401110041 +0000 UTC m=+212.039376814" watchObservedRunningTime="2026-03-13 11:51:36.402689471 +0000 UTC m=+212.040956234" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.403369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" podStartSLOduration=154.403363592 podStartE2EDuration="2m34.403363592s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:36.362132471 +0000 UTC m=+212.000399234" watchObservedRunningTime="2026-03-13 11:51:36.403363592 +0000 UTC m=+212.041630355" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.418910 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42394: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.440136 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.440429 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:36.940416602 +0000 UTC m=+212.578683365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.497396 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:36 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:36 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:36 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.497462 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.541957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.542216 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.042171957 +0000 UTC m=+212.680438720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.571690 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42408: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.643815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.644214 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.144191721 +0000 UTC m=+212.782458564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.684820 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42424: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.744870 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.745235 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.245207404 +0000 UTC m=+212.883474167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.745325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.745673 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.245660308 +0000 UTC m=+212.883927071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.814802 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42432: no serving certificate available for the kubelet" Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.846575 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.846735 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.346707942 +0000 UTC m=+212.984974705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.846879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.847173 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.347166685 +0000 UTC m=+212.985433448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.947838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.948038 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.448007442 +0000 UTC m=+213.086274215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:36 crc kubenswrapper[4837]: I0313 11:51:36.948080 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:36 crc kubenswrapper[4837]: E0313 11:51:36.948445 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.448433386 +0000 UTC m=+213.086700209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.005358 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42442: no serving certificate available for the kubelet" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.048754 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.048957 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.548924802 +0000 UTC m=+213.187191625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.049108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.049473 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.549462819 +0000 UTC m=+213.187729652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.150910 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.151469 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.651451762 +0000 UTC m=+213.289718535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152173 4837 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8vgmn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152237 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152598 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.152623 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.160173 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-659h7" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.230835 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42452: no serving certificate available for the kubelet" Mar 13 11:51:37 
crc kubenswrapper[4837]: I0313 11:51:37.252948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.254438 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.754420315 +0000 UTC m=+213.392687128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.354267 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.354601 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.85456846 +0000 UTC m=+213.492835223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.354899 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.355324 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.855314214 +0000 UTC m=+213.493580977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.456452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.456666 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.956625336 +0000 UTC m=+213.594892099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.456862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.457216 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:37.957209503 +0000 UTC m=+213.595476266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.494995 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:37 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:37 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:37 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.495053 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.558432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.558793 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.058776893 +0000 UTC m=+213.697043656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.659732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.660011 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.159999552 +0000 UTC m=+213.798266315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.714890 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42456: no serving certificate available for the kubelet" Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.762704 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.762886 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.262855232 +0000 UTC m=+213.901122005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.763184 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.763572 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.263559965 +0000 UTC m=+213.901826798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.863798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.864208 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.364189835 +0000 UTC m=+214.002456608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.965401 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:37 crc kubenswrapper[4837]: E0313 11:51:37.965945 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.465900969 +0000 UTC m=+214.104167782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.998526 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:51:37 crc kubenswrapper[4837]: I0313 11:51:37.999407 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004120 4837 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-f97pg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004170 4837 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-f97pg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004195 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podUID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.004245 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" podUID="f8d8640c-c4bc-40ad-9594-7b4fb2c4beb0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.005079 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.034421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.066858 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.067244 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.067347 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.567332095 +0000 UTC m=+214.205598858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168922 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.168988 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169067 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.169704 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 11:51:38.669688089 +0000 UTC m=+214.307954862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2w96t" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169714 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.169789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.175266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"84079e4e10d6acaf8229cfb3ae643344c68afe85070cfbbb2e35088762c2fa76"} Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.181902 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.192460 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.203485 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.216925 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.223406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"community-operators-twtbj\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.245795 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f97pg" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.250319 4837 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.251939 4837 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T11:51:38.250342734Z","Handler":null,"Name":""} Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272211 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.272788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: E0313 11:51:38.273644 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 11:51:38.773604782 +0000 UTC m=+214.411871545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.308150 4837 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.308185 4837 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.320098 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373749 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.373966 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.374706 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.374721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"certified-operators-ft6cr\" (UID: 
\"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.378263 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.379228 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403214 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"certified-operators-ft6cr\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403732 4837 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.403768 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.405415 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.475830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.507173 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42468: no serving certificate available for the kubelet" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.517202 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:38 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:38 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:38 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.517253 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.544471 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579358 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.579959 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.589723 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.590678 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.603524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.624620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"community-operators-vx4r8\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680412 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.680449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.705003 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.705668 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709373 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709652 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.709829 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.716918 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.726138 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2w96t\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.781997 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782314 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782461 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.782535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.783236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.783258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.811703 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.811904 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" containerID="cri-o://0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" gracePeriod=30 Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.812707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"certified-operators-5tnrx\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883395 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.883610 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.888862 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.889054 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" containerID="cri-o://0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" gracePeriod=30 Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.891521 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.910223 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.913015 4837 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9dbhc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:45600->10.217.0.7:8443: read: connection reset by peer" start-of-body= Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.913062 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": read tcp 10.217.0.2:45600->10.217.0.7:8443: read: connection reset by peer" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.938155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:38 crc kubenswrapper[4837]: I0313 11:51:38.976951 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.088727 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.092403 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.212523 4837 generic.go:334] "Generic (PLEG): container finished" podID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerID="0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.212630 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerDied","Data":"0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5"} Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.217520 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.226509 4837 generic.go:334] "Generic (PLEG): container finished" podID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerID="0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" exitCode=0 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.226586 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerDied","Data":"0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d"} Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.244262 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"87f1ed8a9c7321b308a794c5373b316a91bed14f1578617af0e948bfd338f284"} Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.255325 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod278c91cc_2624_42cd_a35e_287e22d22f7d.slice/crio-c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1 WatchSource:0}: Error finding container c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1: Status 404 returned error can't find the container with id c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.492538 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.506747 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:39 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:39 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:39 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.508509 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.574260 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.695903 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.696469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") pod \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\" (UID: \"5a3cabe4-69ee-49f7-a783-e72ac1a56821\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697333 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697469 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config" (OuterVolumeSpecName: "config") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697888 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.697915 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3cabe4-69ee-49f7-a783-e72ac1a56821-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.706046 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.706082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk" (OuterVolumeSpecName: "kube-api-access-sj2pk") pod "5a3cabe4-69ee-49f7-a783-e72ac1a56821" (UID: "5a3cabe4-69ee-49f7-a783-e72ac1a56821"). InnerVolumeSpecName "kube-api-access-sj2pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.725385 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.732067 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9762555c_fc85_46c5_99a4_0b01577780b0.slice/crio-6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db WatchSource:0}: Error finding container 6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db: Status 404 returned error can't find the container with id 6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.764600 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.778609 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.795874 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798513 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798653 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798786 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.798890 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") pod \"f8bc408a-bca6-42ff-8572-2ba9a3978682\" (UID: \"f8bc408a-bca6-42ff-8572-2ba9a3978682\") " Mar 13 11:51:39 crc 
kubenswrapper[4837]: I0313 11:51:39.799186 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj2pk\" (UniqueName: \"kubernetes.io/projected/5a3cabe4-69ee-49f7-a783-e72ac1a56821-kube-api-access-sj2pk\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.799216 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a3cabe4-69ee-49f7-a783-e72ac1a56821-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801162 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config" (OuterVolumeSpecName: "config") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801242 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca" (OuterVolumeSpecName: "client-ca") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801311 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.801726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.808270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.810098 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46" (OuterVolumeSpecName: "kube-api-access-xvs46") pod "f8bc408a-bca6-42ff-8572-2ba9a3978682" (UID: "f8bc408a-bca6-42ff-8572-2ba9a3978682"). InnerVolumeSpecName "kube-api-access-xvs46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.814668 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42470: no serving certificate available for the kubelet" Mar 13 11:51:39 crc kubenswrapper[4837]: W0313 11:51:39.838823 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6870caea_07d6_4465_86b1_645a2e29b240.slice/crio-bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3 WatchSource:0}: Error finding container bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3: Status 404 returned error can't find the container with id bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3 Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.900545 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901136 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bc408a-bca6-42ff-8572-2ba9a3978682-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901152 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvs46\" (UniqueName: \"kubernetes.io/projected/f8bc408a-bca6-42ff-8572-2ba9a3978682-kube-api-access-xvs46\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901169 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:39 crc kubenswrapper[4837]: I0313 11:51:39.901182 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f8bc408a-bca6-42ff-8572-2ba9a3978682-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177685 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: E0313 11:51:40.177944 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177959 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: E0313 11:51:40.177977 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.177986 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.178116 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" containerName="route-controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.178134 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" containerName="controller-manager" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 
11:51:40.178998 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.184484 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.196298 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211181 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.211219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.250282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerStarted","Data":"2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.250335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerStarted","Data":"6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" event={"ID":"5a3cabe4-69ee-49f7-a783-e72ac1a56821","Type":"ContainerDied","Data":"2bc8a3d69075e5c30fa5b45ad6a0c6f1944dbbd0064acaad5eadf14dc600adc9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254083 4837 scope.go:117] "RemoveContainer" containerID="0b1af16cc6188236788eb10501019d25c79e6c73c18075a85efbfcfdd6e8d90d" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.254183 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.259746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" event={"ID":"8bc71239-c925-4911-bfa5-e7a564dcd654","Type":"ContainerStarted","Data":"3f7712027be97760bdddd9977e9a0c621fce0969c2c77a94b09dcb59e4be8db9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.262849 4837 generic.go:334] "Generic (PLEG): container finished" podID="831db5b2-5229-4b52-8783-f99c640ba856" containerID="965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.263214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerDied","Data":"965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.265999 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.2659851460000002 podStartE2EDuration="2.265985146s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.265672056 +0000 UTC m=+215.903938819" watchObservedRunningTime="2026-03-13 11:51:40.265985146 +0000 UTC m=+215.904251909" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267108 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.267211 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerStarted","Data":"b8c38b609b1ee957c7e1e1a563341d86aa7368639c49a74a0e6c541c1d320168"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270413 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.270524 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"4b6c9ae51e3fb9c4dadef31697baf0c351e16ed9f865f9be7126242388f9b2dd"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.282973 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.283046 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.283071 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.294434 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-84xjl" podStartSLOduration=12.294416627 podStartE2EDuration="12.294416627s" podCreationTimestamp="2026-03-13 11:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.29388197 +0000 UTC m=+215.932148753" watchObservedRunningTime="2026-03-13 11:51:40.294416627 +0000 UTC m=+215.932683420" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.295318 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" event={"ID":"f8bc408a-bca6-42ff-8572-2ba9a3978682","Type":"ContainerDied","Data":"79d44a31b910bec33360921358068b0857727b0bb4c82bc65255018460fa2174"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.295330 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9dbhc" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.301509 4837 scope.go:117] "RemoveContainer" containerID="0737572e5f80685157a6578fd12aead5fdbe12b0fbb802f48732112a9a3e2ca5" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302033 4837 generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" exitCode=0 Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302109 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.302142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerStarted","Data":"7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerStarted","Data":"791f2e4e796f079af101ec362853eaa486bb3e46d120e36fdb1c000b9b27a22e"} Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.306420 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.315971 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316499 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.316660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " 
pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.318893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.344823 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"redhat-marketplace-7crb6\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.353651 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.356055 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qs2qs"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.373543 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" podStartSLOduration=158.373523393 podStartE2EDuration="2m38.373523393s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:40.371419988 +0000 UTC m=+216.009686771" watchObservedRunningTime="2026-03-13 11:51:40.373523393 +0000 UTC m=+216.011790156" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.457789 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.465413 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9dbhc"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.500777 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:40 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:40 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:40 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.500875 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.501592 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.502886 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505018 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505693 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505783 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.505836 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.506278 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.506604 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.507027 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.507279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515309 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.516000 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.515802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.516691 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.518727 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.522384 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.523143 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: 
\"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.524287 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.525405 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.527715 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " 
pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.540609 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.571172 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.572129 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.591448 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629554 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629589 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.629609 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630568 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630778 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.630856 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.640292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.643205 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644028 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " 
pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.644792 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.645830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.650449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.664389 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"route-controller-manager-6b69f575c8-6gmv9\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.668720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"controller-manager-cf76c7dc-qtd9h\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747089 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.747320 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l42fc\" (UniqueName: 
\"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.748557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.749403 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.771593 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.779806 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"redhat-marketplace-jspgm\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.829894 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.846422 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:40 crc kubenswrapper[4837]: I0313 11:51:40.893881 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.025428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.029241 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.036370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.044934 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.049214 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.082276 4837 patch_prober.go:28] interesting pod/console-f9d7485db-q2qpt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.082328 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.097397 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3cabe4-69ee-49f7-a783-e72ac1a56821" path="/var/lib/kubelet/pods/5a3cabe4-69ee-49f7-a783-e72ac1a56821/volumes" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.098440 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bc408a-bca6-42ff-8572-2ba9a3978682" path="/var/lib/kubelet/pods/f8bc408a-bca6-42ff-8572-2ba9a3978682/volumes" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099125 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099170 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099184 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099294 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.099314 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.108311 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qqkbm" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.128987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.135412 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.145305 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: 
I0313 11:51:41.153031 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.153124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.210943 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.212169 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.215983 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.254673 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255839 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.255927 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.256047 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.281382 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.282824 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.295814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.304881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8dj7w" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335057 4837 generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298" exitCode=0 Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.335142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"3672e1f233b40bf42b048214c1fa7e9647f6025a8a0466aed9482e60a925fb22"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.340361 4837 generic.go:334] "Generic (PLEG): container finished" podID="9762555c-fc85-46c5-99a4-0b01577780b0" containerID="2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04" exitCode=0 Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.340410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerDied","Data":"2af5ddab1d2a04daf9c57e357b7966b38ff85801a9c08e016c2ec482b2f9eb04"} Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.343944 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357719 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: 
\"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.357934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.359460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.360054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.386941 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.398765 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.400527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"redhat-operators-ng6kk\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.408287 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-472bb" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.495732 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.497873 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:41 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:41 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:41 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.497929 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.588464 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.607317 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.608351 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.613011 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.653201 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763652 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.763750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865360 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865833 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.865949 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.866440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " 
pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.866720 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.871151 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.910267 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"redhat-operators-j246z\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966752 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966822 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.966857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") pod \"831db5b2-5229-4b52-8783-f99c640ba856\" (UID: \"831db5b2-5229-4b52-8783-f99c640ba856\") " Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.968993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume" (OuterVolumeSpecName: "config-volume") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: "831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.973377 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: "831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.975712 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8" (OuterVolumeSpecName: "kube-api-access-pl4f8") pod "831db5b2-5229-4b52-8783-f99c640ba856" (UID: "831db5b2-5229-4b52-8783-f99c640ba856"). InnerVolumeSpecName "kube-api-access-pl4f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:41 crc kubenswrapper[4837]: I0313 11:51:41.979076 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999452 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999724 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:41.999959 4837 patch_prober.go:28] interesting pod/downloads-7954f5f757-8ktsx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.000059 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8ktsx" podUID="85ac6950-8b98-4d0c-8a2b-7eeeac8d1435" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.069780 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831db5b2-5229-4b52-8783-f99c640ba856-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.070303 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl4f8\" (UniqueName: \"kubernetes.io/projected/831db5b2-5229-4b52-8783-f99c640ba856-kube-api-access-pl4f8\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.070319 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/831db5b2-5229-4b52-8783-f99c640ba856-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.085203 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.177365 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.179568 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xhx6c" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.369288 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.409003 4837 ???:1] "http: TLS handshake error from 192.168.126.11:42486: no serving certificate available for the kubelet" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.447000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerStarted","Data":"b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.447048 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerStarted","Data":"351e128a579d7bea389593621f8531499b1484f659ddbed7034ee720b2bb6945"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.450275 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.474159 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.477929 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" exitCode=0 Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.478014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.478041 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerStarted","Data":"307f294c9c816d0f8c581cbf3561f2a5e0cff01395517438e2ad320ce61f35e4"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.492369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerStarted","Data":"68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.492720 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podStartSLOduration=3.4927026469999998 podStartE2EDuration="3.492702647s" podCreationTimestamp="2026-03-13 11:51:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:42.490681454 +0000 UTC m=+218.128948227" watchObservedRunningTime="2026-03-13 11:51:42.492702647 +0000 UTC m=+218.130969410" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.499308 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:42 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:42 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:42 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.499633 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" event={"ID":"831db5b2-5229-4b52-8783-f99c640ba856","Type":"ContainerDied","Data":"3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505307 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c294e8fd793300ee1caaf9d2068d0b14e0a9dc058385ad90bb33a7237e0e283" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.505379 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.520000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerStarted","Data":"04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.520127 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerStarted","Data":"d7a58daa62b7a3f44dc2a8d87fb35984d20d34c979da845c5833bab0d1c0d7f2"} Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.526782 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.541797 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.597940 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podStartSLOduration=3.597915461 podStartE2EDuration="3.597915461s" podCreationTimestamp="2026-03-13 11:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:42.585046458 +0000 UTC m=+218.223313211" watchObservedRunningTime="2026-03-13 11:51:42.597915461 +0000 UTC m=+218.236182224" 
Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.983052 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:51:42 crc kubenswrapper[4837]: I0313 11:51:42.996870 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:42 crc kubenswrapper[4837]: W0313 11:51:42.998709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a36cbe_a17f_46bf_9c6a_1df6f427e2c6.slice/crio-524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60 WatchSource:0}: Error finding container 524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60: Status 404 returned error can't find the container with id 524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.112185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") pod \"9762555c-fc85-46c5-99a4-0b01577780b0\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") pod \"9762555c-fc85-46c5-99a4-0b01577780b0\" (UID: \"9762555c-fc85-46c5-99a4-0b01577780b0\") " Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113336 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113363 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.113408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.114396 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9762555c-fc85-46c5-99a4-0b01577780b0" (UID: "9762555c-fc85-46c5-99a4-0b01577780b0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.116737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.128118 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129518 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.129727 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9762555c-fc85-46c5-99a4-0b01577780b0" (UID: "9762555c-fc85-46c5-99a4-0b01577780b0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214509 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9762555c-fc85-46c5-99a4-0b01577780b0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.214524 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9762555c-fc85-46c5-99a4-0b01577780b0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.220660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e5afeb-4720-4593-a53e-dfb5381d0b1d-metrics-certs\") pod \"network-metrics-daemon-cjn4q\" (UID: \"86e5afeb-4720-4593-a53e-dfb5381d0b1d\") " pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.260806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.271646 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.279400 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.471265 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cjn4q" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.494979 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:43 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:43 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:43 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.495213 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.544777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerStarted","Data":"5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.565911 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.565890414 podStartE2EDuration="2.565890414s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:43.565211424 +0000 UTC m=+219.203478197" watchObservedRunningTime="2026-03-13 11:51:43.565890414 +0000 UTC m=+219.204157187" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570503 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9762555c-fc85-46c5-99a4-0b01577780b0","Type":"ContainerDied","Data":"6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570543 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6127d8b87e94fb8dd1104b1d5a80edde6a809747eb5ba6dc0602986991a215db" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.570598 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582021 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e" exitCode=0 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582108 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.582137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590520 4837 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" exitCode=0 Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590674 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305"} Mar 13 11:51:43 crc kubenswrapper[4837]: I0313 11:51:43.590720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"eac45e620e44e693cbb55f704b7783d81f0f024e3e2cf4051be3383dc9b6b145"} Mar 13 11:51:43 crc kubenswrapper[4837]: W0313 11:51:43.959269 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582 WatchSource:0}: Error finding container 9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582: Status 404 returned error can't find the container with id 9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582 Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.049709 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cjn4q"] Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.290256 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-z9thp" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.498889 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:44 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:44 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:44 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.498966 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.622785 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"5cf98a5b729f9333f1f80a59486dc7faa7bcb28e5a0ff758d9fc65192d2b963f"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.629241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"470c532726880d1e6f64d1b2504c3040447e21a8badc2ff8ff10632e0dfad3b7"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.631185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5c62d412c0b98557eabe844fb8b508b384241ec442c29112b6bcb46aecab33af"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.659682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f92ebade66e7aabb858b5a2cd9e46c26aa00174bf8ba4e8fbf822142b02c3cba"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.659754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9a1a9acd8c95fe633e0ad75eed184ecded06d9cd1da532cd24c0c79c70ede582"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.670147 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7bc6ebb4cef7add0cd370a574f88318c57908982e7687b876dd4f22f8dba508e"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.670205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a4d28a7b7e7e955e2a6a9bca6a2142d04feb7e810a4e97116b1e3c960df2ae1f"} Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.671111 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.673531 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerID="5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4" exitCode=0 Mar 13 11:51:44 crc kubenswrapper[4837]: I0313 11:51:44.673916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerDied","Data":"5f2332ff2aa4ac65770f7fb36b9b44babbbf5ea1ede6559d7a38c358e4838de4"} Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.493374 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:45 crc kubenswrapper[4837]: [-]has-synced 
failed: reason withheld Mar 13 11:51:45 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:45 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.493441 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.682902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"e1ea74358708fea67b17abc9bede9d292fe5bf85f5d2f9c7ae44e817c98fa621"} Mar 13 11:51:45 crc kubenswrapper[4837]: I0313 11:51:45.903630 4837 ???:1] "http: TLS handshake error from 192.168.126.11:52396: no serving certificate available for the kubelet" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.116253 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190353 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") pod \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") pod \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\" (UID: \"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3\") " Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.190801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" (UID: "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.204772 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" (UID: "5b3036ca-4a0e-45d9-9c51-c3faa6067ce3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.291826 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.291857 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b3036ca-4a0e-45d9-9c51-c3faa6067ce3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.493700 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:46 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:46 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:46 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.493774 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753203 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5b3036ca-4a0e-45d9-9c51-c3faa6067ce3","Type":"ContainerDied","Data":"68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a"} Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.753374 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68c9618069477f1223e3ecc0d2dec6041262b1cf792bbe2a6cc6e41865aea27a" Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.775894 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cjn4q" event={"ID":"86e5afeb-4720-4593-a53e-dfb5381d0b1d","Type":"ContainerStarted","Data":"be923e781809c02a28a4c9369907b19014d2a115043ade3ff095562f00fa19e4"} Mar 13 11:51:46 crc kubenswrapper[4837]: I0313 11:51:46.803027 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cjn4q" podStartSLOduration=164.803004427 podStartE2EDuration="2m44.803004427s" podCreationTimestamp="2026-03-13 11:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:51:46.802358927 +0000 UTC m=+222.440625690" watchObservedRunningTime="2026-03-13 11:51:46.803004427 +0000 UTC m=+222.441271190" Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.493131 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:47 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:47 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:47 crc kubenswrapper[4837]: 
healthz check failed Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.493205 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:47 crc kubenswrapper[4837]: I0313 11:51:47.549811 4837 ???:1] "http: TLS handshake error from 192.168.126.11:52398: no serving certificate available for the kubelet" Mar 13 11:51:48 crc kubenswrapper[4837]: I0313 11:51:48.493707 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:48 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:48 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:48 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:48 crc kubenswrapper[4837]: I0313 11:51:48.493823 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:49 crc kubenswrapper[4837]: I0313 11:51:49.492130 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:49 crc kubenswrapper[4837]: [-]has-synced failed: reason withheld Mar 13 11:51:49 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:49 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:49 crc kubenswrapper[4837]: I0313 11:51:49.492221 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:50 crc kubenswrapper[4837]: I0313 11:51:50.497793 4837 patch_prober.go:28] interesting pod/router-default-5444994796-9tkxg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 11:51:50 crc kubenswrapper[4837]: [+]has-synced ok Mar 13 11:51:50 crc kubenswrapper[4837]: [+]process-running ok Mar 13 11:51:50 crc kubenswrapper[4837]: healthz check failed Mar 13 11:51:50 crc kubenswrapper[4837]: I0313 11:51:50.497871 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9tkxg" podUID="3eaa54fb-8d70-463c-8388-9f8443a480ed" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.074345 4837 patch_prober.go:28] interesting pod/console-f9d7485db-q2qpt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.074699 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.493298 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:51 crc kubenswrapper[4837]: I0313 11:51:51.495945 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9tkxg" Mar 13 11:51:52 crc kubenswrapper[4837]: I0313 11:51:52.004867 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8ktsx" Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.102486 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.103164 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" containerID="cri-o://04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43" gracePeriod=30 Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.122045 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.122311 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" containerID="cri-o://b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" gracePeriod=30 Mar 13 11:51:58 crc kubenswrapper[4837]: I0313 11:51:58.918160 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.137715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138112 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138134 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138153 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138162 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles" Mar 13 11:52:00 crc kubenswrapper[4837]: E0313 11:52:00.138183 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138193 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138413 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="831db5b2-5229-4b52-8783-f99c640ba856" containerName="collect-profiles" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138444 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3036ca-4a0e-45d9-9c51-c3faa6067ce3" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.138460 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9762555c-fc85-46c5-99a4-0b01577780b0" containerName="pruner" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.139214 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.141794 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.146772 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.179078 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.279798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.298383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"auto-csr-approver-29556712-g8877\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.581071 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.831320 4837 patch_prober.go:28] interesting pod/controller-manager-cf76c7dc-qtd9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.831407 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.848550 4837 patch_prober.go:28] interesting pod/route-controller-manager-6b69f575c8-6gmv9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.848608 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.884278 4837 generic.go:334] "Generic (PLEG): container finished" podID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerID="b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" exitCode=0 Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.884349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerDied","Data":"b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc"} Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.886325 4837 generic.go:334] "Generic (PLEG): container finished" podID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerID="04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43" exitCode=0 Mar 13 11:52:00 crc kubenswrapper[4837]: I0313 11:52:00.886348 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerDied","Data":"04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43"} Mar 13 11:52:01 crc kubenswrapper[4837]: I0313 11:52:01.078079 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:52:01 crc kubenswrapper[4837]: I0313 11:52:01.082183 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 11:52:05 crc kubenswrapper[4837]: I0313 11:52:05.484227 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 13 11:52:05 crc kubenswrapper[4837]: I0313 11:52:05.484597 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:52:08 crc kubenswrapper[4837]: I0313 11:52:08.053692 4837 ???:1] "http: TLS handshake error from 192.168.126.11:39490: no serving certificate available for the kubelet" Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.534400 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bcfcc" Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.832437 4837 patch_prober.go:28] interesting pod/controller-manager-cf76c7dc-qtd9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.832731 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.848793 4837 patch_prober.go:28] interesting pod/route-controller-manager-6b69f575c8-6gmv9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:52:11 crc kubenswrapper[4837]: I0313 11:52:11.848876 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.957820 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.958238 4837 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 11:52:12 crc kubenswrapper[4837]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 11:52:12 crc kubenswrapper[4837]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlwdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556710-lcprh_openshift-infra(0484d991-f239-47a2-80ff-0237945c27ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 11:52:12 crc kubenswrapper[4837]: > logger="UnhandledError" Mar 13 11:52:12 crc kubenswrapper[4837]: E0313 11:52:12.960005 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podUID="0484d991-f239-47a2-80ff-0237945c27ac" Mar 13 11:52:12 crc kubenswrapper[4837]: I0313 11:52:12.969271 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:52:12 crc kubenswrapper[4837]: I0313 11:52:12.972950 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:12.999324 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.000029 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000042 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.000053 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000059 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000156 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" containerName="controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000165 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" containerName="route-controller-manager" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.000501 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.013974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040618 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040698 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040931 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.040963 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041043 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041084 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") pod \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\" (UID: \"a3c9b59a-0eeb-49e0-86ef-30222e5926aa\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041104 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") pod \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\" (UID: \"12e5f732-00c7-49ae-9e3e-121aa7caa6ee\") " Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041223 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041766 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config" (OuterVolumeSpecName: "config") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.041793 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config" (OuterVolumeSpecName: "config") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.042340 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.042532 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.043362 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.046033 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.047043 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k" (OuterVolumeSpecName: "kube-api-access-7km6k") pod "12e5f732-00c7-49ae-9e3e-121aa7caa6ee" (UID: "12e5f732-00c7-49ae-9e3e-121aa7caa6ee"). InnerVolumeSpecName "kube-api-access-7km6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.047149 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns" (OuterVolumeSpecName: "kube-api-access-x2mns") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "kube-api-access-x2mns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.061117 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3c9b59a-0eeb-49e0-86ef-30222e5926aa" (UID: "a3c9b59a-0eeb-49e0-86ef-30222e5926aa"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.142990 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143156 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143215 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143226 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2mns\" (UniqueName: \"kubernetes.io/projected/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-kube-api-access-x2mns\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143237 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7km6k\" (UniqueName: \"kubernetes.io/projected/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-kube-api-access-7km6k\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143246 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143254 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 
11:52:13.143264 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143273 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e5f732-00c7-49ae-9e3e-121aa7caa6ee-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143281 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.143290 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9b59a-0eeb-49e0-86ef-30222e5926aa-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.144814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.148172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.159459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"controller-manager-6dfc58dd94-n92qv\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165361 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf76c7dc-qtd9h" event={"ID":"a3c9b59a-0eeb-49e0-86ef-30222e5926aa","Type":"ContainerDied","Data":"d7a58daa62b7a3f44dc2a8d87fb35984d20d34c979da845c5833bab0d1c0d7f2"} Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.165491 4837 scope.go:117] "RemoveContainer" containerID="04c1b4cdc66c99fd06f47d8f53d5a9118d0695a5ac3f712471886566797fee43" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.167010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" event={"ID":"12e5f732-00c7-49ae-9e3e-121aa7caa6ee","Type":"ContainerDied","Data":"351e128a579d7bea389593621f8531499b1484f659ddbed7034ee720b2bb6945"} Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.167054 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9" Mar 13 11:52:13 crc kubenswrapper[4837]: E0313 11:52:13.168914 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podUID="0484d991-f239-47a2-80ff-0237945c27ac" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.194061 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.197384 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b69f575c8-6gmv9"] Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.203125 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.206984 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cf76c7dc-qtd9h"] Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.280805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 11:52:13 crc kubenswrapper[4837]: I0313 11:52:13.328863 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.410548 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.411611 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.414311 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.414513 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.421435 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.462383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.462439 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.563209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.563571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.564605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.583698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:14 crc kubenswrapper[4837]: I0313 11:52:14.742559 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.056270 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e5f732-00c7-49ae-9e3e-121aa7caa6ee" path="/var/lib/kubelet/pods/12e5f732-00c7-49ae-9e3e-121aa7caa6ee/volumes" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.057067 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c9b59a-0eeb-49e0-86ef-30222e5926aa" path="/var/lib/kubelet/pods/a3c9b59a-0eeb-49e0-86ef-30222e5926aa/volumes" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.527656 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.549371 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.549477 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.551898 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.552160 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.552370 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553181 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553388 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.553592 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576504 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576566 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod 
\"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.576619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.677888 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.677953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.677975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.678017 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.678746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.680450 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.682909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " 
pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.697123 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"route-controller-manager-588697dd78-t4tn5\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:15 crc kubenswrapper[4837]: I0313 11:52:15.868428 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.791273 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.791683 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xx4zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j246z_openshift-marketplace(32a36cbe-a17f-46bf-9c6a-1df6f427e2c6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:16 crc kubenswrapper[4837]: E0313 11:52:16.792880 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" Mar 13 11:52:18 crc kubenswrapper[4837]: I0313 11:52:18.121653 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.165044 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" Mar 13 11:52:18 crc kubenswrapper[4837]: I0313 11:52:18.220107 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.222772 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.222933 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l42fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jspgm_openshift-marketplace(5236ae0e-b305-4f1c-9125-bbac1eeb07f3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:18 crc kubenswrapper[4837]: E0313 11:52:18.224201 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.006674 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.007584 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.018896 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122038 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.122205 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223119 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223149 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223285 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.223391 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"installer-9-crc\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.241208 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: I0313 11:52:19.336261 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.721777 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.818494 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.818670 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmfv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vx4r8_openshift-marketplace(45e6ae52-59ef-446f-917a-549d34ffbf8e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819751 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819774 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vx4r8" 
podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.819847 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcgkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-twtbj_openshift-marketplace(278c91cc-2624-42cd-a35e-287e22d22f7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.820993 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.854331 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.854470 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpclz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ng6kk_openshift-marketplace(bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.855625 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.861778 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.861911 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wfddm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7crb6_openshift-marketplace(080747b0-3d43-4ff1-b21c-b8ea9fc2f961): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:19 crc kubenswrapper[4837]: E0313 11:52:19.863096 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331333 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331440 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331446 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vx4r8" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.331511 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" Mar 13 11:52:21 
crc kubenswrapper[4837]: I0313 11:52:21.372340 4837 scope.go:117] "RemoveContainer" containerID="b82fa6f2134589dee51636289f2f8c0ff8d4c77d04a184b0382c40aa9a2b8bdc" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.415213 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.415412 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfvgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ft6cr_openshift-marketplace(e6060cf2-077e-4112-af57-f100e297f320): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.416561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.470214 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.470404 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rpl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5tnrx_openshift-marketplace(6870caea-07d6-4465-86b1-645a2e29b240): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 11:52:21 crc kubenswrapper[4837]: E0313 11:52:21.471850 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.779863 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.784405 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.790893 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87edec8a_33b2_44c0_bbcb_1e4f5dded1b2.slice/crio-a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f WatchSource:0}: Error finding container a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f: Status 404 returned error can't find the container with id a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.844280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.850987 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af86f4e_8143_426d_98a6_b59bde2a6247.slice/crio-0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c WatchSource:0}: Error finding container 0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c: Status 404 returned error can't find the container with id 
0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.899555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 11:52:21 crc kubenswrapper[4837]: I0313 11:52:21.910031 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.912416 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0db827be_f908_46a8_9402_a858214284e7.slice/crio-dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb WatchSource:0}: Error finding container dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb: Status 404 returned error can't find the container with id dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb Mar 13 11:52:21 crc kubenswrapper[4837]: W0313 11:52:21.928889 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddef1c7aa_51a6_4ee0_93d5_714721e9fc27.slice/crio-971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0 WatchSource:0}: Error finding container 971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0: Status 404 returned error can't find the container with id 971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerStarted","Data":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212594 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" containerID="cri-o://156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" gracePeriod=30 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212663 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.212682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerStarted","Data":"e8304698d4bf5325cbeebf58e902f92321fa9a970fc375c2b81310795d7c51bb"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.216050 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerStarted","Data":"1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.216095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerStarted","Data":"dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.221031 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerStarted","Data":"65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.221085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerStarted","Data":"971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.223413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerStarted","Data":"a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerStarted","Data":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226157 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerStarted","Data":"0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c"} Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226305 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" containerID="cri-o://e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" gracePeriod=30 Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.226801 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.227967 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.227989 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.234506 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podStartSLOduration=24.234487848 podStartE2EDuration="24.234487848s" podCreationTimestamp="2026-03-13 11:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.231468243 +0000 UTC m=+257.869735016" watchObservedRunningTime="2026-03-13 11:52:22.234487848 +0000 UTC m=+257.872754621" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.252187 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.252164652 podStartE2EDuration="4.252164652s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.245600806 +0000 UTC m=+257.883867569" watchObservedRunningTime="2026-03-13 11:52:22.252164652 +0000 UTC m=+257.890431415" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.263562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podStartSLOduration=24.263547228 podStartE2EDuration="24.263547228s" podCreationTimestamp="2026-03-13 11:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.26330042 +0000 UTC m=+257.901567183" watchObservedRunningTime="2026-03-13 11:52:22.263547228 +0000 UTC m=+257.901813991" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.290303 4837 patch_prober.go:28] interesting pod/controller-manager-6dfc58dd94-n92qv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": EOF" start-of-body= Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.290365 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": EOF" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.312007 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=8.311987254 podStartE2EDuration="8.311987254s" podCreationTimestamp="2026-03-13 11:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:22.293470905 +0000 UTC m=+257.931737668" watchObservedRunningTime="2026-03-13 11:52:22.311987254 +0000 UTC m=+257.950254017" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.544861 4837 patch_prober.go:28] interesting pod/route-controller-manager-588697dd78-t4tn5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:59940->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.545212 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:59940->10.217.0.61:8443: read: connection reset by peer" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.685410 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.714776 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:22 crc kubenswrapper[4837]: E0313 11:52:22.715016 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715031 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715177 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerName="controller-manager" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.715608 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.724036 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.847305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588697dd78-t4tn5_94a9fa12-c97d-4b13-81a1-da33f15c7f42/route-controller-manager/0.log" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.847571 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.866956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867316 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867419 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") pod \"9af86f4e-8143-426d-98a6-b59bde2a6247\" (UID: \"9af86f4e-8143-426d-98a6-b59bde2a6247\") " Mar 13 
11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.867948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868501 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca" (OuterVolumeSpecName: "client-ca") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868561 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config" (OuterVolumeSpecName: "config") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.868865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.874121 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.874330 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j" (OuterVolumeSpecName: "kube-api-access-q4k7j") pod "9af86f4e-8143-426d-98a6-b59bde2a6247" (UID: "9af86f4e-8143-426d-98a6-b59bde2a6247"). InnerVolumeSpecName "kube-api-access-q4k7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970401 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970464 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970527 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970568 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") pod \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\" (UID: \"94a9fa12-c97d-4b13-81a1-da33f15c7f42\") " Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970815 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " 
pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970964 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970979 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af86f4e-8143-426d-98a6-b59bde2a6247-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.970992 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4k7j\" (UniqueName: \"kubernetes.io/projected/9af86f4e-8143-426d-98a6-b59bde2a6247-kube-api-access-q4k7j\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971019 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971030 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9af86f4e-8143-426d-98a6-b59bde2a6247-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971465 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca" (OuterVolumeSpecName: "client-ca") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.971701 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config" (OuterVolumeSpecName: "config") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.972257 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.972479 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.973279 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.976326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg" (OuterVolumeSpecName: "kube-api-access-krwzg") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "kube-api-access-krwzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.976458 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "94a9fa12-c97d-4b13-81a1-da33f15c7f42" (UID: "94a9fa12-c97d-4b13-81a1-da33f15c7f42"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.977002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:22 crc kubenswrapper[4837]: I0313 11:52:22.989606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"controller-manager-6f96fbd6b8-wdh8s\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.072863 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krwzg\" (UniqueName: \"kubernetes.io/projected/94a9fa12-c97d-4b13-81a1-da33f15c7f42-kube-api-access-krwzg\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.073899 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a9fa12-c97d-4b13-81a1-da33f15c7f42-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.073923 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.073938 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a9fa12-c97d-4b13-81a1-da33f15c7f42-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.104999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.158795 4837 csr.go:261] certificate signing request csr-qq6w7 is approved, waiting to be issued Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.163933 4837 csr.go:257] certificate signing request csr-qq6w7 is issued Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.237365 4837 generic.go:334] "Generic (PLEG): container finished" podID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerID="8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.237423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerDied","Data":"8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239310 4837 generic.go:334] "Generic (PLEG): container finished" podID="9af86f4e-8143-426d-98a6-b59bde2a6247" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerDied","Data":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239394 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239432 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfc58dd94-n92qv" event={"ID":"9af86f4e-8143-426d-98a6-b59bde2a6247","Type":"ContainerDied","Data":"0a64770191264774b60b3b0035ecdb4182d299c43a1bebd3bdb1f2e9e4efd51c"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.239453 4837 scope.go:117] "RemoveContainer" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242315 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588697dd78-t4tn5_94a9fa12-c97d-4b13-81a1-da33f15c7f42/route-controller-manager/0.log" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242349 4837 generic.go:334] "Generic (PLEG): container finished" podID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" exitCode=255 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerDied","Data":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.242416 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" event={"ID":"94a9fa12-c97d-4b13-81a1-da33f15c7f42","Type":"ContainerDied","Data":"e8304698d4bf5325cbeebf58e902f92321fa9a970fc375c2b81310795d7c51bb"} Mar 13 11:52:23 crc kubenswrapper[4837]: 
I0313 11:52:23.242467 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.245802 4837 generic.go:334] "Generic (PLEG): container finished" podID="0db827be-f908-46a8-9402-a858214284e7" containerID="1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd" exitCode=0 Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.245870 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerDied","Data":"1302ef186a9c607b00444c8a5a974e46bcf1b9f7828a0fc6da6c8957d4d4e5cd"} Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.267995 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.270615 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dfc58dd94-n92qv"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.274038 4837 scope.go:117] "RemoveContainer" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: E0313 11:52:23.276115 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": container with ID starting with e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0 not found: ID does not exist" containerID="e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.276172 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0"} err="failed to get container status \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": rpc error: code = NotFound desc = could not find container \"e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0\": container with ID starting with e8a2cef8e75a4e0ace069da9cf579224b66ef8558043d4da9db0156d41a9a6b0 not found: ID does not exist" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.276198 4837 scope.go:117] "RemoveContainer" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.293231 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.298988 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.301427 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588697dd78-t4tn5"] Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.303827 4837 scope.go:117] "RemoveContainer" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: E0313 11:52:23.304798 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": container with ID starting with 156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9 not found: ID does not exist" containerID="156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9" Mar 13 11:52:23 crc kubenswrapper[4837]: I0313 11:52:23.304828 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9"} err="failed to get container status \"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": rpc error: code = NotFound desc = could not find container \"156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9\": container with ID starting with 156ca548c0f1c4ee06367765d9ba3fe7ce821e85f789c679adc2538ac26cece9 not found: ID does not exist" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.166473 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 20:52:31.936490827 +0000 UTC Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.166513 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6513h0m7.769979994s for next certificate rotation Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253505 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerStarted","Data":"a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910"} Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerStarted","Data":"a1d4f83811bf42ca7944c4333709cb3c7a2ea535fb4a2466c48aaccfc4a847ba"} Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.253998 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.267887 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.280350 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" podStartSLOduration=6.280327626 podStartE2EDuration="6.280327626s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:24.274548805 +0000 UTC m=+259.912815578" watchObservedRunningTime="2026-03-13 11:52:24.280327626 +0000 UTC m=+259.918594389" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.510264 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.578494 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.593977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") pod \"0db827be-f908-46a8-9402-a858214284e7\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") pod \"0db827be-f908-46a8-9402-a858214284e7\" (UID: \"0db827be-f908-46a8-9402-a858214284e7\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0db827be-f908-46a8-9402-a858214284e7" (UID: "0db827be-f908-46a8-9402-a858214284e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") pod \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\" (UID: \"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2\") " Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.594305 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0db827be-f908-46a8-9402-a858214284e7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.642147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0db827be-f908-46a8-9402-a858214284e7" (UID: "0db827be-f908-46a8-9402-a858214284e7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.642327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb" (OuterVolumeSpecName: "kube-api-access-9xqlb") pod "87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" (UID: "87edec8a-33b2-44c0-bbcb-1e4f5dded1b2"). InnerVolumeSpecName "kube-api-access-9xqlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.695051 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0db827be-f908-46a8-9402-a858214284e7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:24 crc kubenswrapper[4837]: I0313 11:52:24.695095 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xqlb\" (UniqueName: \"kubernetes.io/projected/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2-kube-api-access-9xqlb\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.057993 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" path="/var/lib/kubelet/pods/94a9fa12-c97d-4b13-81a1-da33f15c7f42/volumes" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.058842 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af86f4e-8143-426d-98a6-b59bde2a6247" path="/var/lib/kubelet/pods/9af86f4e-8143-426d-98a6-b59bde2a6247/volumes" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.166777 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-16 19:25:23.776019141 +0000 UTC Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.166835 4837 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5959h32m58.609186902s for next certificate rotation Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273456 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0db827be-f908-46a8-9402-a858214284e7","Type":"ContainerDied","Data":"dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb"} Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273501 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbdd7156061fda7162f147ca816854574c46fd0ae13942bad0a58567040c30eb" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.273567 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556712-g8877" event={"ID":"87edec8a-33b2-44c0-bbcb-1e4f5dded1b2","Type":"ContainerDied","Data":"a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f"} Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275387 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21cd2ecc8f429b49f1437604fc5e32c63e6746f08d9a4b8da352497b132669f" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.275352 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556712-g8877" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.530974 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531256 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531278 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531295 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531303 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: E0313 11:52:25.531323 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531331 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531445 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a9fa12-c97d-4b13-81a1-da33f15c7f42" containerName="route-controller-manager" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531462 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db827be-f908-46a8-9402-a858214284e7" containerName="pruner" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531478 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" containerName="oc" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.531905 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.533585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534060 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534407 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534582 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534756 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.534946 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.558877 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710008 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710077 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.710127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod 
\"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812481 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.812515 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.814197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.814327 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.816952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.831015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"route-controller-manager-cc969d968-65b4j\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:25 crc kubenswrapper[4837]: I0313 11:52:25.860214 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:26 crc kubenswrapper[4837]: I0313 11:52:26.248339 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:26 crc kubenswrapper[4837]: W0313 11:52:26.258080 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0f7bce_90d1_4fe8_a832_8c4f55efd886.slice/crio-9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331 WatchSource:0}: Error finding container 9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331: Status 404 returned error can't find the container with id 9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331 Mar 13 11:52:26 crc kubenswrapper[4837]: I0313 11:52:26.282113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerStarted","Data":"9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331"} Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.288983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerStarted","Data":"c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef"} Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.289243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.295577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:27 crc kubenswrapper[4837]: I0313 11:52:27.306171 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" podStartSLOduration=9.306155286 podStartE2EDuration="9.306155286s" podCreationTimestamp="2026-03-13 11:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:27.304040026 +0000 UTC m=+262.942306789" watchObservedRunningTime="2026-03-13 11:52:27.306155286 +0000 UTC m=+262.944422049" Mar 13 11:52:30 crc kubenswrapper[4837]: I0313 11:52:30.308386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerStarted","Data":"b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f"} Mar 13 11:52:30 crc kubenswrapper[4837]: I0313 11:52:30.323381 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556710-lcprh" podStartSLOduration=94.271420207 podStartE2EDuration="2m30.323357475s" podCreationTimestamp="2026-03-13 11:50:00 +0000 UTC" firstStartedPulling="2026-03-13 11:51:33.87443962 +0000 UTC m=+209.512706383" lastFinishedPulling="2026-03-13 11:52:29.926376888 +0000 UTC m=+265.564643651" observedRunningTime="2026-03-13 11:52:30.320144101 +0000 UTC m=+265.958410864" watchObservedRunningTime="2026-03-13 11:52:30.323357475 +0000 UTC m=+265.961624238" Mar 
13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.317145 4837 generic.go:334] "Generic (PLEG): container finished" podID="0484d991-f239-47a2-80ff-0237945c27ac" containerID="b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f" exitCode=0 Mar 13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.317239 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerDied","Data":"b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f"} Mar 13 11:52:31 crc kubenswrapper[4837]: I0313 11:52:31.319271 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84"} Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.327030 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84" exitCode=0 Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.327125 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84"} Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.722863 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.829156 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") pod \"0484d991-f239-47a2-80ff-0237945c27ac\" (UID: \"0484d991-f239-47a2-80ff-0237945c27ac\") " Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.834624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw" (OuterVolumeSpecName: "kube-api-access-dlwdw") pod "0484d991-f239-47a2-80ff-0237945c27ac" (UID: "0484d991-f239-47a2-80ff-0237945c27ac"). InnerVolumeSpecName "kube-api-access-dlwdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:32 crc kubenswrapper[4837]: I0313 11:52:32.930055 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwdw\" (UniqueName: \"kubernetes.io/projected/0484d991-f239-47a2-80ff-0237945c27ac-kube-api-access-dlwdw\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337022 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556710-lcprh" event={"ID":"0484d991-f239-47a2-80ff-0237945c27ac","Type":"ContainerDied","Data":"960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b"} Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337105 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960f7af1fa61c8ed012820a8878b593f9924c583dd0d3076ea82e4ba9452a14b" Mar 13 11:52:33 crc kubenswrapper[4837]: I0313 11:52:33.337124 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556710-lcprh" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.483937 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.484411 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.484912 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.486086 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:52:35 crc kubenswrapper[4837]: I0313 11:52:35.486175 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a" gracePeriod=600 Mar 13 11:52:36 crc kubenswrapper[4837]: I0313 11:52:36.361355 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a" exitCode=0 Mar 13 11:52:36 crc kubenswrapper[4837]: I0313 11:52:36.361389 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a"} Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.114932 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.116602 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" containerID="cri-o://a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" gracePeriod=30 Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.126247 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:38 crc kubenswrapper[4837]: I0313 11:52:38.126494 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" 
containerID="cri-o://c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" gracePeriod=30 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.377448 4837 generic.go:334] "Generic (PLEG): container finished" podID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerID="c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" exitCode=0 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.377500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerDied","Data":"c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef"} Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.378887 4837 generic.go:334] "Generic (PLEG): container finished" podID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerID="a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" exitCode=0 Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.378907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerDied","Data":"a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910"} Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.740762 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795171 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:39 crc kubenswrapper[4837]: E0313 11:52:39.795455 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795474 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: E0313 11:52:39.795488 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795502 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795677 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0484d991-f239-47a2-80ff-0237945c27ac" containerName="oc" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.795700 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" containerName="controller-manager" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.796132 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.803140 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.817855 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.817937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819046 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819126 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819345 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") pod \"2f85724b-0c9e-4a01-927a-0054866f46d5\" (UID: \"2f85724b-0c9e-4a01-927a-0054866f46d5\") " Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819867 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819872 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config" (OuterVolumeSpecName: "config") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.819999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820041 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820102 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820195 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.820264 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.834806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.834829 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb" (OuterVolumeSpecName: "kube-api-access-6jdgb") pod "2f85724b-0c9e-4a01-927a-0054866f46d5" (UID: "2f85724b-0c9e-4a01-927a-0054866f46d5"). InnerVolumeSpecName "kube-api-access-6jdgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921328 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921454 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921496 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921577 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f85724b-0c9e-4a01-927a-0054866f46d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921590 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdgb\" (UniqueName: \"kubernetes.io/projected/2f85724b-0c9e-4a01-927a-0054866f46d5-kube-api-access-6jdgb\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.921602 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f85724b-0c9e-4a01-927a-0054866f46d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.922655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.923772 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.924096 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.927143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:39 crc kubenswrapper[4837]: I0313 11:52:39.941885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"controller-manager-7bb68b4479-msngs\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.113372 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.194999 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225121 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225189 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.225271 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") pod \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\" (UID: \"1c0f7bce-90d1-4fe8-a832-8c4f55efd886\") " Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.226417 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.226447 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config" (OuterVolumeSpecName: "config") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.228607 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx" (OuterVolumeSpecName: "kube-api-access-fqdrx") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "kube-api-access-fqdrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.228717 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c0f7bce-90d1-4fe8-a832-8c4f55efd886" (UID: "1c0f7bce-90d1-4fe8-a832-8c4f55efd886"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327606 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327744 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327765 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqdrx\" (UniqueName: \"kubernetes.io/projected/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-kube-api-access-fqdrx\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.327788 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c0f7bce-90d1-4fe8-a832-8c4f55efd886-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384835 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" event={"ID":"2f85724b-0c9e-4a01-927a-0054866f46d5","Type":"ContainerDied","Data":"a1d4f83811bf42ca7944c4333709cb3c7a2ea535fb4a2466c48aaccfc4a847ba"} Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384860 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.384891 4837 scope.go:117] "RemoveContainer" containerID="a26cbf0bbee08d073239d4a9f9f827d33e581ba8c5b8a64d42ab65f1206a8910" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.387056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" event={"ID":"1c0f7bce-90d1-4fe8-a832-8c4f55efd886","Type":"ContainerDied","Data":"9be860a81b1b95a82b5cdc8cfccb2c34c9083d28d8fc3fa8cb17545d34f6a331"} Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.387153 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j" Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.417206 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.420242 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f96fbd6b8-wdh8s"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.428804 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.431240 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cc969d968-65b4j"] Mar 13 11:52:40 crc kubenswrapper[4837]: I0313 11:52:40.713997 4837 scope.go:117] "RemoveContainer" containerID="c5b04d576ec724fb239a6b87808e44a211c810b2a1f9473b6bf7e374d0f380ef" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.055826 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" path="/var/lib/kubelet/pods/1c0f7bce-90d1-4fe8-a832-8c4f55efd886/volumes" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.060197 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f85724b-0c9e-4a01-927a-0054866f46d5" path="/var/lib/kubelet/pods/2f85724b-0c9e-4a01-927a-0054866f46d5/volumes" Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.272111 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:41 crc kubenswrapper[4837]: W0313 11:52:41.312736 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b640fc3_2425_48b0_adfa_3300a6d52002.slice/crio-097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a WatchSource:0}: Error finding container 097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a: Status 404 returned error can't find the container with id 097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.414204 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.422005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.430540 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" exitCode=0 Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.430688 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3"} Mar 13 11:52:41 crc 
kubenswrapper[4837]: I0313 11:52:41.446123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerStarted","Data":"14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.448533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.458698 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" exitCode=0 Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.458779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.470795 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerStarted","Data":"097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.473086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.483257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.486729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} Mar 13 11:52:41 crc kubenswrapper[4837]: I0313 11:52:41.595207 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j246z" podStartSLOduration=8.203274834 podStartE2EDuration="1m0.595013537s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="2026-03-13 11:51:43.586525391 +0000 UTC m=+219.224792154" lastFinishedPulling="2026-03-13 11:52:35.978264094 +0000 UTC m=+271.616530857" observedRunningTime="2026-03-13 11:52:41.567901267 +0000 UTC m=+277.206168031" watchObservedRunningTime="2026-03-13 11:52:41.595013537 +0000 UTC m=+277.233280300" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.086129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.086437 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:42 crc 
kubenswrapper[4837]: I0313 11:52:42.494012 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.494073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.496725 4837 generic.go:334] "Generic (PLEG): container finished" podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.497367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.498922 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerStarted","Data":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.500909 4837 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.500949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.506145 4837 generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.506226 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.507710 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerStarted","Data":"a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.508599 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.511666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerStarted","Data":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.514912 4837 
generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650" exitCode=0 Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.515160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.516038 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.544107 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:42 crc kubenswrapper[4837]: E0313 11:52:42.544989 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.545012 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.549187 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0f7bce-90d1-4fe8-a832-8c4f55efd886" containerName="route-controller-manager" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.549995 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.553305 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" podStartSLOduration=4.553278738 podStartE2EDuration="4.553278738s" podCreationTimestamp="2026-03-13 11:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:42.543523192 +0000 UTC m=+278.181789955" watchObservedRunningTime="2026-03-13 11:52:42.553278738 +0000 UTC m=+278.191545501" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.557957 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.558606 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.558910 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559142 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559463 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.559783 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 
11:52:42.588470 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.592138 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vx4r8" podStartSLOduration=2.95879682 podStartE2EDuration="1m4.592118957s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.268777084 +0000 UTC m=+215.907043847" lastFinishedPulling="2026-03-13 11:52:41.902099221 +0000 UTC m=+277.540365984" observedRunningTime="2026-03-13 11:52:42.573662119 +0000 UTC m=+278.211928882" watchObservedRunningTime="2026-03-13 11:52:42.592118957 +0000 UTC m=+278.230385730" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662524 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662553 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.662606 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.669233 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jspgm" podStartSLOduration=3.315294886 podStartE2EDuration="1m2.669217516s" podCreationTimestamp="2026-03-13 11:51:40 +0000 UTC" firstStartedPulling="2026-03-13 11:51:42.505912501 +0000 UTC m=+218.144179264" lastFinishedPulling="2026-03-13 11:52:41.859835131 +0000 UTC m=+277.498101894" observedRunningTime="2026-03-13 11:52:42.649862969 +0000 UTC m=+278.288129732" watchObservedRunningTime="2026-03-13 11:52:42.669217516 +0000 UTC m=+278.307484279" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764229 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: 
\"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764556 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.764772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.766219 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.767750 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.780079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.795390 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"route-controller-manager-6764db5ddf-mqtsb\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:42 crc kubenswrapper[4837]: I0313 11:52:42.887181 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.228899 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" probeResult="failure" output=< Mar 13 11:52:43 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 11:52:43 crc kubenswrapper[4837]: > Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.318094 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:43 crc kubenswrapper[4837]: W0313 11:52:43.324121 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6fdbbb9_292f_4621_892a_53a6c1c13f65.slice/crio-ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4 WatchSource:0}: Error finding container ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4: Status 404 returned error can't find the container with id ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4 Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.522102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerStarted","Data":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.525157 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerStarted","Data":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.526342 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerStarted","Data":"ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4"} Mar 13 11:52:43 crc kubenswrapper[4837]: I0313 11:52:43.538685 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5tnrx" podStartSLOduration=2.928566741 podStartE2EDuration="1m5.538664688s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.303557463 +0000 UTC m=+215.941824226" lastFinishedPulling="2026-03-13 11:52:42.91365541 +0000 UTC m=+278.551922173" observedRunningTime="2026-03-13 11:52:43.537395208 +0000 UTC m=+279.175661991" watchObservedRunningTime="2026-03-13 11:52:43.538664688 +0000 UTC m=+279.176931451" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.534036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerStarted","Data":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.536859 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerStarted","Data":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} Mar 13 
11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.539490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerStarted","Data":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.541059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerStarted","Data":"7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3"} Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.562344 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7crb6" podStartSLOduration=2.913330798 podStartE2EDuration="1m4.56232436s" podCreationTimestamp="2026-03-13 11:51:40 +0000 UTC" firstStartedPulling="2026-03-13 11:51:41.349855809 +0000 UTC m=+216.988122582" lastFinishedPulling="2026-03-13 11:52:42.998849381 +0000 UTC m=+278.637116144" observedRunningTime="2026-03-13 11:52:43.558889654 +0000 UTC m=+279.197156427" watchObservedRunningTime="2026-03-13 11:52:44.56232436 +0000 UTC m=+280.200591123" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.580421 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" podStartSLOduration=6.580400795 podStartE2EDuration="6.580400795s" podCreationTimestamp="2026-03-13 11:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:52:44.580360554 +0000 UTC m=+280.218627317" watchObservedRunningTime="2026-03-13 11:52:44.580400795 +0000 UTC m=+280.218667558" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.583715 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ft6cr" podStartSLOduration=3.814979068 podStartE2EDuration="1m6.583697552s" podCreationTimestamp="2026-03-13 11:51:38 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.27279963 +0000 UTC m=+215.911066393" lastFinishedPulling="2026-03-13 11:52:43.041518114 +0000 UTC m=+278.679784877" observedRunningTime="2026-03-13 11:52:44.562216976 +0000 UTC m=+280.200483749" watchObservedRunningTime="2026-03-13 11:52:44.583697552 +0000 UTC m=+280.221964315" Mar 13 11:52:44 crc kubenswrapper[4837]: I0313 11:52:44.602303 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twtbj" podStartSLOduration=4.751976723 podStartE2EDuration="1m7.602283484s" podCreationTimestamp="2026-03-13 11:51:37 +0000 UTC" firstStartedPulling="2026-03-13 11:51:40.293287261 +0000 UTC m=+215.931554034" lastFinishedPulling="2026-03-13 11:52:43.143594032 +0000 UTC m=+278.781860795" observedRunningTime="2026-03-13 11:52:44.599114243 +0000 UTC m=+280.237381006" watchObservedRunningTime="2026-03-13 11:52:44.602283484 +0000 UTC m=+280.240550247" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.552497 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.560570 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:45 crc kubenswrapper[4837]: I0313 11:52:45.580854 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ng6kk" podStartSLOduration=5.03349287 podStartE2EDuration="1m4.580826804s" podCreationTimestamp="2026-03-13 11:51:41 +0000 UTC" firstStartedPulling="2026-03-13 11:51:43.602827571 +0000 UTC m=+219.241094334" lastFinishedPulling="2026-03-13 11:52:43.150161505 +0000 UTC m=+278.788428268" observedRunningTime="2026-03-13 11:52:44.633510007 +0000 UTC m=+280.271776770" watchObservedRunningTime="2026-03-13 11:52:45.580826804 +0000 UTC m=+281.219093567" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.320877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.322573 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.374670 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.545290 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.546850 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.588380 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.610078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.717696 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.717756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.754047 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.978344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:48 crc kubenswrapper[4837]: I0313 11:52:48.978400 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.034551 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.616936 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.619048 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:49 crc kubenswrapper[4837]: I0313 11:52:49.632603 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.506198 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.506483 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.575600 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.713447 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.880720 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.894504 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.894577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:50 crc kubenswrapper[4837]: I0313 11:52:50.952592 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.588984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.589141 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vx4r8" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" containerID="cri-o://437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" gracePeriod=2 Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.589188 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.631724 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.662331 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.882131 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:51 crc kubenswrapper[4837]: I0313 11:52:51.882357 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5tnrx" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" containerID="cri-o://da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" gracePeriod=2 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.060628 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.088702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") pod \"45e6ae52-59ef-446f-917a-549d34ffbf8e\" (UID: \"45e6ae52-59ef-446f-917a-549d34ffbf8e\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.091152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities" (OuterVolumeSpecName: "utilities") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.095814 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2" (OuterVolumeSpecName: "kube-api-access-xmfv2") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "kube-api-access-xmfv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.132199 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.152342 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45e6ae52-59ef-446f-917a-549d34ffbf8e" (UID: "45e6ae52-59ef-446f-917a-549d34ffbf8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.167308 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190465 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmfv2\" (UniqueName: \"kubernetes.io/projected/45e6ae52-59ef-446f-917a-549d34ffbf8e-kube-api-access-xmfv2\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190508 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.190598 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e6ae52-59ef-446f-917a-549d34ffbf8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.487871 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.596919 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.596993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.597078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") pod \"6870caea-07d6-4465-86b1-645a2e29b240\" (UID: \"6870caea-07d6-4465-86b1-645a2e29b240\") " Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598165 4837 generic.go:334] "Generic (PLEG): container finished" podID="6870caea-07d6-4465-86b1-645a2e29b240" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" exitCode=0 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5tnrx" event={"ID":"6870caea-07d6-4465-86b1-645a2e29b240","Type":"ContainerDied","Data":"bfe6aca1334934677df8bf272b7d6fdeb1c785b92dcc8ef7c0566c6636ddfaa3"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598309 4837 scope.go:117] "RemoveContainer" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598338 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities" (OuterVolumeSpecName: "utilities") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.598480 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5tnrx" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.601385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9" (OuterVolumeSpecName: "kube-api-access-4rpl9") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "kube-api-access-4rpl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603358 4837 generic.go:334] "Generic (PLEG): container finished" podID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" exitCode=0 Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603421 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vx4r8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.603526 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vx4r8" event={"ID":"45e6ae52-59ef-446f-917a-549d34ffbf8e","Type":"ContainerDied","Data":"b8c38b609b1ee957c7e1e1a563341d86aa7368639c49a74a0e6c541c1d320168"} Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.625746 4837 scope.go:117] "RemoveContainer" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.645280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.648767 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vx4r8"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.652904 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.661477 4837 scope.go:117] "RemoveContainer" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680090 4837 scope.go:117] "RemoveContainer" containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.680654 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": container with ID starting with da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3 not found: ID does not exist" 
containerID="da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680697 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3"} err="failed to get container status \"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": rpc error: code = NotFound desc = could not find container \"da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3\": container with ID starting with da09fb123851d2f67106724c294a3c4bfa2982538246898019311eb5b26eaca3 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.680727 4837 scope.go:117] "RemoveContainer" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.681068 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": container with ID starting with e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62 not found: ID does not exist" containerID="e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681126 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62"} err="failed to get container status \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": rpc error: code = NotFound desc = could not find container \"e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62\": container with ID starting with e7b7bdfa9636e908ca85620d4aa821fe24093ede2d0e43b39c562069b0b2da62 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681160 4837 scope.go:117] "RemoveContainer" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.681581 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": container with ID starting with ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d not found: ID does not exist" containerID="ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681656 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d"} err="failed to get container status \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": rpc error: code = NotFound desc = could not find container \"ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d\": container with ID starting with ce94a39ad6afbdcb9bafffd4ef0157cee1cf3107185ab6fcc83d0813bc89a94d not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.681677 4837 scope.go:117] "RemoveContainer" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.699366 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "6870caea-07d6-4465-86b1-645a2e29b240" (UID: "6870caea-07d6-4465-86b1-645a2e29b240"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702391 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rpl9\" (UniqueName: \"kubernetes.io/projected/6870caea-07d6-4465-86b1-645a2e29b240-kube-api-access-4rpl9\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702778 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.702989 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6870caea-07d6-4465-86b1-645a2e29b240-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.703887 4837 scope.go:117] "RemoveContainer" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.725602 4837 scope.go:117] "RemoveContainer" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739122 4837 scope.go:117] "RemoveContainer" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.739796 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": container with ID starting with 437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8 not found: ID does not exist" containerID="437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739843 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8"} err="failed to get container status \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": rpc error: code = NotFound desc = could not find container \"437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8\": container with ID starting with 437830c336afe1b86abf5ecc6f987c951e9edf2376fca0f8dd482ae1f16fa9c8 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.739870 4837 scope.go:117] "RemoveContainer" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.740137 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": container with ID starting with 3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a not found: ID does not exist" containerID="3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740167 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a"} err="failed to get container status 
\"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": rpc error: code = NotFound desc = could not find container \"3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a\": container with ID starting with 3eeb1bee61050e1f32f457f88245564bdfbb228b2e770616db2d75ef4e55866a not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740187 4837 scope.go:117] "RemoveContainer" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: E0313 11:52:52.740411 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": container with ID starting with 29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780 not found: ID does not exist" containerID="29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.740451 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780"} err="failed to get container status \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": rpc error: code = NotFound desc = could not find container \"29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780\": container with ID starting with 29b7adb5a9c0b54134cefd4b865e773828049385da6ba275d2791faad9875780 not found: ID does not exist" Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.928018 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:52 crc kubenswrapper[4837]: I0313 11:52:52.931540 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5tnrx"] Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.055536 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" path="/var/lib/kubelet/pods/45e6ae52-59ef-446f-917a-549d34ffbf8e/volumes" Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.056208 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6870caea-07d6-4465-86b1-645a2e29b240" path="/var/lib/kubelet/pods/6870caea-07d6-4465-86b1-645a2e29b240/volumes" Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.282744 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:53 crc kubenswrapper[4837]: I0313 11:52:53.609254 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jspgm" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" containerID="cri-o://685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" gracePeriod=2 Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.088272 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.121828 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.121958 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.122071 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") pod \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\" (UID: \"5236ae0e-b305-4f1c-9125-bbac1eeb07f3\") " Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.122973 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities" (OuterVolumeSpecName: "utilities") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.123905 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.126595 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc" (OuterVolumeSpecName: "kube-api-access-l42fc") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "kube-api-access-l42fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.148310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5236ae0e-b305-4f1c-9125-bbac1eeb07f3" (UID: "5236ae0e-b305-4f1c-9125-bbac1eeb07f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.225224 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.225268 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l42fc\" (UniqueName: \"kubernetes.io/projected/5236ae0e-b305-4f1c-9125-bbac1eeb07f3-kube-api-access-l42fc\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618619 4837 generic.go:334] "Generic (PLEG): container finished" podID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" exitCode=0 Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618705 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jspgm" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.618933 4837 scope.go:117] "RemoveContainer" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.619213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jspgm" event={"ID":"5236ae0e-b305-4f1c-9125-bbac1eeb07f3","Type":"ContainerDied","Data":"307f294c9c816d0f8c581cbf3561f2a5e0cff01395517438e2ad320ce61f35e4"} Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.646100 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.649701 4837 scope.go:117] "RemoveContainer" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.650227 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jspgm"] Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.675299 4837 scope.go:117] "RemoveContainer" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689455 4837 scope.go:117] "RemoveContainer" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.689858 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": container with ID starting with 685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e not found: ID does not exist" containerID="685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689899 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e"} err="failed to get container status 
\"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": rpc error: code = NotFound desc = could not find container \"685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e\": container with ID starting with 685e1bab7476abed18e470abc3204572f12f0276a4f4090ebce1ddb59e18c03e not found: ID does not exist" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.689925 4837 scope.go:117] "RemoveContainer" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.690311 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": container with ID starting with 2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3 not found: ID does not exist" containerID="2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690358 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3"} err="failed to get container status \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": rpc error: code = NotFound desc = could not find container \"2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3\": container with ID starting with 2b9a329e446509859ed3d2441ed61db997519085747ff5a45b3c942e5be127c3 not found: ID does not exist" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690375 4837 scope.go:117] "RemoveContainer" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: E0313 11:52:54.690695 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": container with ID starting with 23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624 not found: ID does not exist" containerID="23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624" Mar 13 11:52:54 crc kubenswrapper[4837]: I0313 11:52:54.690749 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624"} err="failed to get container status \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": rpc error: code = NotFound desc = could not find container \"23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624\": container with ID starting with 23cf1d25c2a824231d5fd39eb5a9920394e587e257254f6ffa6bf893fe9f2624 not found: ID does not exist" Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.055361 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" path="/var/lib/kubelet/pods/5236ae0e-b305-4f1c-9125-bbac1eeb07f3/volumes" Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.686578 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:55 crc kubenswrapper[4837]: I0313 11:52:55.687014 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j246z" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" 
containerID="cri-o://14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" gracePeriod=2 Mar 13 11:52:56 crc kubenswrapper[4837]: I0313 11:52:56.634721 4837 generic.go:334] "Generic (PLEG): container finished" podID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerID="14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" exitCode=0 Mar 13 11:52:56 crc kubenswrapper[4837]: I0313 11:52:56.634764 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2"} Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.185321 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261844 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.261953 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") pod \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\" (UID: \"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6\") " Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.262897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities" (OuterVolumeSpecName: "utilities") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.266943 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq" (OuterVolumeSpecName: "kube-api-access-xx4zq") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "kube-api-access-xx4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.362872 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.362899 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4zq\" (UniqueName: \"kubernetes.io/projected/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-kube-api-access-xx4zq\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.404903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" (UID: "32a36cbe-a17f-46bf-9c6a-1df6f427e2c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.464820 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642780 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j246z" event={"ID":"32a36cbe-a17f-46bf-9c6a-1df6f427e2c6","Type":"ContainerDied","Data":"524d259feb76e4121fddc10b32a9829c69c7b137ab82d2d2c18f81ea9d556b60"} Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642833 4837 scope.go:117] "RemoveContainer" containerID="14abcb9ea1fbbb60b399b29e871faa61fa90553e6b8ac0c1201e048e766e55b2" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.642839 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j246z" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.660151 4837 scope.go:117] "RemoveContainer" containerID="73bd112e9625df12ee2d18dc3c732843142d27d7ee69bd7e78c3b55fe032dc84" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.679328 4837 scope.go:117] "RemoveContainer" containerID="613107c1ce24dcf9cb1cf0c1623f3de9a7d5b33bc09c57a646911cae7011d82e" Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.680234 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:57 crc kubenswrapper[4837]: I0313 11:52:57.687743 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j246z"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.128118 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.128346 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" containerID="cri-o://a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" gracePeriod=30 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.230458 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.231293 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" containerID="cri-o://7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" gracePeriod=30 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.649889 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerID="a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" exitCode=0 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.649966 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerDied","Data":"a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0"} Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.658369 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerID="7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" exitCode=0 Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.658404 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerDied","Data":"7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3"} Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.784996 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.791725 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.893944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.893993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894132 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") pod \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\" (UID: \"e6fdbbb9-292f-4621-892a-53a6c1c13f65\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.894230 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") pod \"5b640fc3-2425-48b0-adfa-3300a6d52002\" (UID: \"5b640fc3-2425-48b0-adfa-3300a6d52002\") " Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: 
"e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895150 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config" (OuterVolumeSpecName: "config") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895188 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca" (OuterVolumeSpecName: "client-ca") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895250 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config" (OuterVolumeSpecName: "config") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895459 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895473 4837 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895482 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895490 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b640fc3-2425-48b0-adfa-3300a6d52002-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.895498 4837 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6fdbbb9-292f-4621-892a-53a6c1c13f65-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907915 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.907952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg" (OuterVolumeSpecName: "kube-api-access-kndkg") pod "5b640fc3-2425-48b0-adfa-3300a6d52002" (UID: "5b640fc3-2425-48b0-adfa-3300a6d52002"). InnerVolumeSpecName "kube-api-access-kndkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.908060 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b" (OuterVolumeSpecName: "kube-api-access-rft5b") pod "e6fdbbb9-292f-4621-892a-53a6c1c13f65" (UID: "e6fdbbb9-292f-4621-892a-53a6c1c13f65"). InnerVolumeSpecName "kube-api-access-rft5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996671 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6fdbbb9-292f-4621-892a-53a6c1c13f65-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996708 4837 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b640fc3-2425-48b0-adfa-3300a6d52002-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996719 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kndkg\" (UniqueName: \"kubernetes.io/projected/5b640fc3-2425-48b0-adfa-3300a6d52002-kube-api-access-kndkg\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:58 crc kubenswrapper[4837]: I0313 11:52:58.996730 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rft5b\" (UniqueName: \"kubernetes.io/projected/e6fdbbb9-292f-4621-892a-53a6c1c13f65-kube-api-access-rft5b\") on node \"crc\" DevicePath \"\"" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.054062 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" path="/var/lib/kubelet/pods/32a36cbe-a17f-46bf-9c6a-1df6f427e2c6/volumes" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562189 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"] Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562560 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562594 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562619 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-utilities" 
Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562684 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562700 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562723 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562735 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562752 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562763 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562785 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562798 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562810 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562847 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-content" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562873 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562885 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562905 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562917 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562932 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562944 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562961 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" 
containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.562974 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="extract-utilities" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.562992 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563005 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.563024 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563036 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.563058 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563070 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563228 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e6ae52-59ef-446f-917a-549d34ffbf8e" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563261 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6870caea-07d6-4465-86b1-645a2e29b240" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563276 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" containerName="route-controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563296 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5236ae0e-b305-4f1c-9125-bbac1eeb07f3" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563313 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a36cbe-a17f-46bf-9c6a-1df6f427e2c6" containerName="registry-server" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.563326 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" containerName="controller-manager" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.564201 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.573543 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.574187 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.574276 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.578146 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612526 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.612672 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" event={"ID":"5b640fc3-2425-48b0-adfa-3300a6d52002","Type":"ContainerDied","Data":"097c9cec1e11f8aa83b30bf8b029f844e3b8d7cc463f16a2a3f8fa54330c0b4a"} Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666442 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bb68b4479-msngs" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.666481 4837 scope.go:117] "RemoveContainer" containerID="a06e6485ac9f8f31411ee3ba603ad966da0a6cdc4ea8a84b4df845f32f9ff0f0" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.668130 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" event={"ID":"e6fdbbb9-292f-4621-892a-53a6c1c13f65","Type":"ContainerDied","Data":"ec6669f507f7c1917535f69f1388c95e2e8ec84b237ee008635f694dc17102e4"} Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.668247 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.688040 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.688978 4837 scope.go:117] "RemoveContainer" containerID="7fdcd49d58a893ad2332b8993c2a1cdba6ab455eb48a11b6f32d6c2c777e93c3" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.690811 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bb68b4479-msngs"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.695487 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.697987 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6764db5ddf-mqtsb"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714074 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714193 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714220 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714240 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714266 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod 
\"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714286 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.714328 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.715568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-client-ca\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.715850 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-config\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.721977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-serving-cert\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.730948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9b8\" (UniqueName: \"kubernetes.io/projected/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50-kube-api-access-5w9b8\") pod \"route-controller-manager-54bb9484b9-l9k8j\" (UID: \"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\") " pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.741441 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742302 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742656 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742681 4837 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742788 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742800 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742809 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742815 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742823 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742829 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742835 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742841 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742848 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742856 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742862 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742868 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742883 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742893 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc 
kubenswrapper[4837]: I0313 11:52:59.742899 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.742907 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742913 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.742993 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743001 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743009 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743017 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743024 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743030 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743039 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.743128 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743135 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743212 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.743522 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744211 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744853 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" containerID="cri-o://abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.744924 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.745023 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.745117 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" gracePeriod=15 Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815442 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815488 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815549 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.815664 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.816285 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-client-ca\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: 
E0313 11:52:59.816441 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.816891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-config\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.817152 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:00.31712132 +0000 UTC m=+295.955388153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:52:59 crc kubenswrapper[4837]: E0313 11:52:59.817514 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-68c5756767-4nmg2.189c646eaeb065ea openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-68c5756767-4nmg2,UID:557e8146-afbb-41a0-a477-c69f4575656c,APIVersion:v1,ResourceVersion:29927,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-r8xbg\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.138:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,LastTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.817726 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/557e8146-afbb-41a0-a477-c69f4575656c-proxy-ca-bundles\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.823195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/557e8146-afbb-41a0-a477-c69f4575656c-serving-cert\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917117 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917193 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917214 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917230 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917273 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917281 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917321 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.917399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:52:59 crc kubenswrapper[4837]: I0313 11:52:59.928006 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.221990 4837 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.222322 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.323418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.324140 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.324253 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:01.324219938 +0000 UTC m=+296.962486741 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.508904 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.508984 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.509008 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 11:53:00 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878" Netns:"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": 
dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:00 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:00 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.509106 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878\\\" Netns:\\\"/var/run/netns/d70e04b4-380d-4dc7-a3e7-5f2471ba8c3a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=0e521213d7075b7205e2d99e482f1c55c0f091f3d98c1703c32a1f05a4e7e878;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s\\\": dial tcp 38.102.83.138:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.631761 4837 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.632460 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.633099 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.633529 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.633849 4837 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.633917 4837 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.634379 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="200ms" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.644676 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:00Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:27f5385c5b700fb400a618b51a628f0db39afa4a8db03380252ca5abf49518da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d8cd257adb4bde31657aa6b0fe5da54d74b1f9eda5457c8dee929ed64ecece0\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221692102},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\
":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: 
E0313 11:53:00.645480 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.645963 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646205 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646545 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.646579 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.678175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.679863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680833 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680866 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680889 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680898 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" exitCode=2 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.680947 4837 scope.go:117] "RemoveContainer" containerID="6497d34f903113b60e61cd8a78263095184d7d0705eb29311b1a337ad03105c8" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.685695 4837 generic.go:334] "Generic (PLEG): container finished" podID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerID="65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87" exitCode=0 Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.685781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerDied","Data":"65e28d9ae9393725ced85e7d2690513d16c23a3765c0987cd0c005a1bb7bef87"} Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.686298 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.686612 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.687365 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: I0313 11:53:00.687804 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:00 crc kubenswrapper[4837]: E0313 11:53:00.835558 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="400ms" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.055306 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b640fc3-2425-48b0-adfa-3300a6d52002" path="/var/lib/kubelet/pods/5b640fc3-2425-48b0-adfa-3300a6d52002/volumes" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.056685 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fdbbb9-292f-4621-892a-53a6c1c13f65" path="/var/lib/kubelet/pods/e6fdbbb9-292f-4621-892a-53a6c1c13f65/volumes" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.237321 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="800ms" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238072 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238149 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238173 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 11:53:01 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586" Netns:"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 11:53:01 crc kubenswrapper[4837]: > pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.238244 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager(c6c2dd46-4cc0-4802-b96e-7d395d3dbc50)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-54bb9484b9-l9k8j_openshift-route-controller-manager_c6c2dd46-4cc0-4802-b96e-7d395d3dbc50_0(a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586): error adding pod openshift-route-controller-manager_route-controller-manager-54bb9484b9-l9k8j to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586\\\" Netns:\\\"/var/run/netns/e4fa0d47-327f-469f-9fcd-afaac8c1cd71\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-54bb9484b9-l9k8j;K8S_POD_INFRA_CONTAINER_ID=a8c46c0a779ef94bfb027c26c4a1554b6e9d988b0405909a29e7717aeba7d586;K8S_POD_UID=c6c2dd46-4cc0-4802-b96e-7d395d3dbc50\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j] networking: Multus: [openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j/c6c2dd46-4cc0-4802-b96e-7d395d3dbc50]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-54bb9484b9-l9k8j in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-54bb9484b9-l9k8j?timeout=1m0s\\\": dial tcp 38.102.83.138:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.335465 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.336163 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: E0313 11:53:01.336253 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. 
No retries permitted until 2026-03-13 11:53:03.33623303 +0000 UTC m=+298.974499803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.694152 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.983065 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:53:01 crc kubenswrapper[4837]: I0313 11:53:01.984297 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.038273 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="1.6s" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042658 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042752 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") pod \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\" (UID: \"def1c7aa-51a6-4ee0-93d5-714721e9fc27\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.042813 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock" (OuterVolumeSpecName: "var-lock") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.043124 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.043144 4837 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.048392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "def1c7aa-51a6-4ee0-93d5-714721e9fc27" (UID: "def1c7aa-51a6-4ee0-93d5-714721e9fc27"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.118513 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.119338 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.119991 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.120452 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.144269 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/def1c7aa-51a6-4ee0-93d5-714721e9fc27-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245198 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 11:53:02 crc 
kubenswrapper[4837]: I0313 11:53:02.245214 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245288 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245401 4837 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245416 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.245461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.346427 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.710775 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711697 4837 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" exitCode=0 Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711813 4837 scope.go:117] "RemoveContainer" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.711826 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"def1c7aa-51a6-4ee0-93d5-714721e9fc27","Type":"ContainerDied","Data":"971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0"} Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715113 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="971c2474cd87ef98e0d1f40d16055d082ae34c7f72c4c883a6feb86fd0ff4ce0" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.715370 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.729179 4837 scope.go:117] "RemoveContainer" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.740158 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.740803 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.744538 4837 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.745293 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.751940 4837 scope.go:117] "RemoveContainer" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.764740 4837 scope.go:117] "RemoveContainer" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.776783 4837 scope.go:117] "RemoveContainer" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.795110 4837 scope.go:117] "RemoveContainer" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.812600 4837 scope.go:117] "RemoveContainer" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.813151 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": container with ID starting with abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea not found: ID does not exist" containerID="abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.813200 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea"} err="failed to get container status \"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": rpc error: code = NotFound desc = could not find container 
\"abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea\": container with ID starting with abb4f7913ed2023bd133ac1171cd590f8b0366200f10ee3b27c1d2c3195fc8ea not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.813236 4837 scope.go:117] "RemoveContainer" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.814087 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": container with ID starting with 682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17 not found: ID does not exist" containerID="682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814168 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17"} err="failed to get container status \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": rpc error: code = NotFound desc = could not find container \"682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17\": container with ID starting with 682e36b6cbab4248f1486812db6307149c168d45176f3a76b32dce2f6cfc0d17 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814213 4837 scope.go:117] "RemoveContainer" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.814847 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": container with ID starting with 9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2 not found: ID does not exist" containerID="9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814879 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2"} err="failed to get container status \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": rpc error: code = NotFound desc = could not find container \"9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2\": container with ID starting with 9babc6a2fb34708385b95415ab1b6d766ac7f9bfb4f4d37dd1d0841baca343f2 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.814931 4837 scope.go:117] "RemoveContainer" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.815433 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": container with ID starting with 804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208 not found: ID does not exist" containerID="804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815491 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208"} 
err="failed to get container status \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": rpc error: code = NotFound desc = could not find container \"804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208\": container with ID starting with 804167fb1a3dacfbee36e416e31cd2c4ba7f08659412d423efa25475ae05d208 not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815516 4837 scope.go:117] "RemoveContainer" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.815789 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": container with ID starting with f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab not found: ID does not exist" containerID="f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815821 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab"} err="failed to get container status \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": rpc error: code = NotFound desc = could not find container \"f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab\": container with ID starting with f3bc71461eaae5f83cf7a5464f82961158b241944a8d8e4dded476ce41d025ab not found: ID does not exist" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.815838 4837 scope.go:117] "RemoveContainer" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: E0313 11:53:02.816265 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": container with ID starting with 6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75 not found: ID does not exist" containerID="6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75" Mar 13 11:53:02 crc kubenswrapper[4837]: I0313 11:53:02.816293 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75"} err="failed to get container status \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": rpc error: code = NotFound desc = could not find container \"6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75\": container with ID starting with 6b4f142349ff7953df04a82076568ff7046b7f7990dc5a6db3973dfea47aac75 not found: ID does not exist" Mar 13 11:53:03 crc kubenswrapper[4837]: I0313 11:53:03.056456 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 11:53:03 crc kubenswrapper[4837]: I0313 11:53:03.362134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:03 crc kubenswrapper[4837]: 
E0313 11:53:03.362767 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:03 crc kubenswrapper[4837]: E0313 11:53:03.362850 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:07.362827541 +0000 UTC m=+303.001094304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:03 crc kubenswrapper[4837]: E0313 11:53:03.639483 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="3.2s" Mar 13 11:53:04 crc kubenswrapper[4837]: E0313 11:53:04.786615 4837 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:04 crc kubenswrapper[4837]: I0313 11:53:04.793294 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:04 crc kubenswrapper[4837]: W0313 11:53:04.821021 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876 WatchSource:0}: Error finding container 472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876: Status 404 returned error can't find the container with id 472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876 Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.050864 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.733369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e"} Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.733660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"472f6fd64f8b2b79919c1f52a7a66b4196ddee94f4c73bc4bae77a352b472876"} Mar 13 11:53:05 crc kubenswrapper[4837]: E0313 11:53:05.734259 4837 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:05 crc kubenswrapper[4837]: I0313 11:53:05.734359 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:06 crc kubenswrapper[4837]: E0313 11:53:06.115874 4837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.138:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-68c5756767-4nmg2.189c646eaeb065ea openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-68c5756767-4nmg2,UID:557e8146-afbb-41a0-a477-c69f4575656c,APIVersion:v1,ResourceVersion:29927,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-r8xbg\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.138:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC m=+295.455375743,LastTimestamp:2026-03-13 11:52:59.81710897 +0000 UTC 
m=+295.455375743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 11:53:06 crc kubenswrapper[4837]: E0313 11:53:06.840517 4837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" interval="6.4s" Mar 13 11:53:07 crc kubenswrapper[4837]: I0313 11:53:07.419012 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:07 crc kubenswrapper[4837]: E0313 11:53:07.420096 4837 projected.go:194] Error preparing data for projected volume kube-api-access-r8xbg for pod openshift-controller-manager/controller-manager-68c5756767-4nmg2: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:07 crc kubenswrapper[4837]: E0313 11:53:07.420173 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg podName:557e8146-afbb-41a0-a477-c69f4575656c nodeName:}" failed. No retries permitted until 2026-03-13 11:53:15.420152635 +0000 UTC m=+311.058419398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8xbg" (UniqueName: "kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg") pod "controller-manager-68c5756767-4nmg2" (UID: "557e8146-afbb-41a0-a477-c69f4575656c") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.138:6443: connect: connection refused Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.880409 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T11:53:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:27f5385c5b700fb400a618b51a628f0db39afa4a8db03380252ca5abf49518da\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d8cd257adb4bde31657aa6b0fe5da54d74b1f9eda5457c8dee929ed64ecece0\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221692102},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\
":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: 
E0313 11:53:10.881503 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882141 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882448 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882776 4837 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:10 crc kubenswrapper[4837]: E0313 11:53:10.882807 4837 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.047763 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.049414 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.073576 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.073618 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: E0313 11:53:11.074218 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.074802 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.815624 4837 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="45e7881172f4f96a2ab523ddff9d11417fc1f83ee6944eb7846886e80e9ec03b" exitCode=0 Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.815742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"45e7881172f4f96a2ab523ddff9d11417fc1f83ee6944eb7846886e80e9ec03b"} Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c253ac6482cc3e65aaeedf2ec09af79a4403516e10e60eabd19cc7da59376637"} Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816846 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.816885 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:11 crc kubenswrapper[4837]: I0313 11:53:11.817326 4837 status_manager.go:851] "Failed to get status for pod" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" Mar 13 11:53:11 crc kubenswrapper[4837]: E0313 11:53:11.817446 4837 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.138:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"466a4dd4a2a6112b71c193c861b87265fa79034d9da85058dae83b5b0ed36623"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb18eaac47947fa66afcc1f892beb971a140cbd7756952a8edf0fd75614e023a"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825746 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91f9e508710d7c4605dd48a03bff40e6ad2f5986c090a33921c24d6137f14829"} Mar 13 11:53:12 crc kubenswrapper[4837]: I0313 11:53:12.825758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9aec12bda443dab2c3646cea5c65c999fbc3a967fd5e31db688ed90bdf968f4f"} Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833322 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e519516e8e77325ae252ee0ce1635557d69f2d3c4b19cb91e4d081292c04e70"} Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833685 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833713 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:13 crc kubenswrapper[4837]: I0313 11:53:13.833881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.047914 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.048484 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.841372 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842615 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842669 4837 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f" exitCode=1 Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.842704 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f"} Mar 13 11:53:14 crc kubenswrapper[4837]: I0313 11:53:14.843154 4837 scope.go:117] "RemoveContainer" containerID="29dcf2d4dbca31492c07df5fcf50217d44ab7914e536e5ae6d8187e8b2b3e62f" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.456083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.479675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xbg\" (UniqueName: \"kubernetes.io/projected/557e8146-afbb-41a0-a477-c69f4575656c-kube-api-access-r8xbg\") pod \"controller-manager-68c5756767-4nmg2\" (UID: \"557e8146-afbb-41a0-a477-c69f4575656c\") " pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.536285 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.853706 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.855042 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 11:53:15 crc kubenswrapper[4837]: I0313 11:53:15.855102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82a2cfe3baf1dac4243d4c8eb1e4cc4e0aabd08de67ece4654f31efea4f2dadf"} Mar 13 11:53:15 crc kubenswrapper[4837]: W0313 11:53:15.932525 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557e8146_afbb_41a0_a477_c69f4575656c.slice/crio-c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d WatchSource:0}: Error finding container c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d: Status 404 returned error can't find the container with id c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.075231 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.075819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.081213 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.860884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" event={"ID":"557e8146-afbb-41a0-a477-c69f4575656c","Type":"ContainerStarted","Data":"bd0597c1cc32fbd78f39ed829a4641d444991994c57793796768416306af6501"} Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.861233 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.861247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" event={"ID":"557e8146-afbb-41a0-a477-c69f4575656c","Type":"ContainerStarted","Data":"c261a0e9e80a95ebb82e7949fc3a9fbd1359cabd2a415e857d84f9967ed8eb5d"} Mar 13 11:53:16 crc kubenswrapper[4837]: I0313 11:53:16.866153 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.842179 4837 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.871010 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:18 crc kubenswrapper[4837]: 
I0313 11:53:18.871046 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.877553 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:18 crc kubenswrapper[4837]: I0313 11:53:18.902310 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5fce7cde-9b5b-40cc-8fdb-92fce8257be0" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" event={"ID":"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50","Type":"ContainerStarted","Data":"37f72418a416002c47a4ca07b43700f505d4349cb995ba856c6770efb8dff147"} Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878439 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" event={"ID":"c6c2dd46-4cc0-4802-b96e-7d395d3dbc50","Type":"ContainerStarted","Data":"babcfb3743f649875ff8952ae21dfa6891e88a59ef60ef37e4e768f1a82566fb"} Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878398 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878551 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.878695 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.896270 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5fce7cde-9b5b-40cc-8fdb-92fce8257be0" Mar 13 11:53:19 crc kubenswrapper[4837]: I0313 11:53:19.944558 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:20 crc kubenswrapper[4837]: I0313 11:53:20.878709 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:20 crc kubenswrapper[4837]: I0313 11:53:20.878793 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:21 crc kubenswrapper[4837]: I0313 11:53:21.879345 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:21 crc kubenswrapper[4837]: I0313 11:53:21.879419 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:22 crc kubenswrapper[4837]: I0313 11:53:22.880067 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:22 crc kubenswrapper[4837]: I0313 11:53:22.880161 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:23 crc kubenswrapper[4837]: I0313 11:53:23.760418 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:23 crc kubenswrapper[4837]: I0313 11:53:23.767830 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:25 crc kubenswrapper[4837]: I0313 11:53:25.897193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 11:53:27 crc kubenswrapper[4837]: I0313 11:53:27.228972 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 11:53:27 crc kubenswrapper[4837]: I0313 11:53:27.470893 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.003316 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.130910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.380789 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.685850 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.822758 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 11:53:29 crc kubenswrapper[4837]: I0313 11:53:29.951687 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.232890 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.333516 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.601759 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.727303 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.819419 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.928290 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.928371 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:30 crc kubenswrapper[4837]: I0313 11:53:30.976076 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.054842 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.247733 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.355315 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.464951 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.535875 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.560808 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 11:53:31 crc kubenswrapper[4837]: I0313 11:53:31.934372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" 
Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.051151 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.133229 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.304254 4837 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.327846 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.365434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.402305 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.504396 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.575520 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.595822 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.636365 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 11:53:32 crc kubenswrapper[4837]: I0313 11:53:32.884488 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.180102 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.196620 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.326333 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.467063 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.469199 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.541368 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.641138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.720461 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.749474 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.832784 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.871853 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.898173 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.898408 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.906015 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.927050 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 11:53:33 crc kubenswrapper[4837]: I0313 11:53:33.982412 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.018887 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.018999 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.075060 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.140176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.149010 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.225183 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.279452 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.281280 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.298738 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.305372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.380159 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.463174 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.492729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.719686 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.720168 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.724885 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 11:53:34 crc kubenswrapper[4837]: I0313 11:53:34.806667 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.102691 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.117844 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.173227 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.283070 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.388772 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.460865 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.487660 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.494276 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.495364 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.595527 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.629218 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.726952 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.931791 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 
11:53:35.936378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.936576 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 11:53:35 crc kubenswrapper[4837]: I0313 11:53:35.974414 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.000839 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.061731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.081199 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.146022 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.237500 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.237510 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.290767 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.319484 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.399745 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.431095 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.499429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.509574 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.567265 4837 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.570900 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.578126 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.614172 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 
11:53:36.663789 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.692781 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.767457 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 11:53:36 crc kubenswrapper[4837]: I0313 11:53:36.808857 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.019979 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.081146 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.090756 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.120409 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.169261 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.224085 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.239077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.258254 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.364973 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.395342 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.446916 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.523149 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.550014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.631706 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.825178 4837 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.826068 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.883617 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.928121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.950565 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 11:53:37 crc kubenswrapper[4837]: I0313 11:53:37.955433 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.054283 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.061684 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.147138 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.171601 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.199263 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.227809 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.238591 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.250282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.293759 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.306242 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.414171 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.453921 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.516533 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.519004 4837 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.617378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.744232 4837 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.796704 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.873250 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.875379 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.927286 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 11:53:38 crc kubenswrapper[4837]: I0313 11:53:38.958791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.070475 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.107352 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.124541 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.158717 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.224710 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.260081 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.295327 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.299204 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.472019 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.523663 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.579282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 
11:53:39.675785 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.730812 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.736487 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.747273 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.778876 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.860138 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.880531 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.889192 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.971592 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 11:53:39 crc kubenswrapper[4837]: I0313 11:53:39.977732 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.025581 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.035255 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.118719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.170958 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.177501 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.263971 4837 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.400695 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.468802 4837 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.486210 4837 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.490394 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68c5756767-4nmg2" podStartSLOduration=42.490363247 podStartE2EDuration="42.490363247s" podCreationTimestamp="2026-03-13 11:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:18.87269223 +0000 UTC m=+314.510958993" watchObservedRunningTime="2026-03-13 11:53:40.490363247 +0000 UTC m=+336.128630050" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.497725 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podStartSLOduration=42.497691755 podStartE2EDuration="42.497691755s" podCreationTimestamp="2026-03-13 11:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:19.894100288 +0000 UTC m=+315.532367051" watchObservedRunningTime="2026-03-13 11:53:40.497691755 +0000 UTC m=+336.135958548" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504404 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504482 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504545 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68c5756767-4nmg2","openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j"] Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.504983 4837 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.505025 4837 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="93dcd114-c39a-4b27-aa9c-a42e3ef7cd79" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.513395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.526302 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.526285342 podStartE2EDuration="22.526285342s" podCreationTimestamp="2026-03-13 11:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:53:40.526076445 +0000 UTC m=+336.164343218" watchObservedRunningTime="2026-03-13 11:53:40.526285342 +0000 UTC m=+336.164552115" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.585626 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.624982 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.626969 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.654175 4837 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.717724 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.731040 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.769595 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.812216 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.825905 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.930040 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.930309 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.942787 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.947895 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.977542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.991491 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 11:53:40 crc kubenswrapper[4837]: I0313 11:53:40.992911 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.098051 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.180488 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.240810 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.259086 4837 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.259369 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" gracePeriod=5 Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.379024 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.480909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.512757 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.546579 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.643064 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.725596 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.800025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.831659 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.848977 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.856563 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.873029 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.893478 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930503 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930542 4837 patch_prober.go:28] interesting pod/route-controller-manager-54bb9484b9-l9k8j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 11:53:41 crc kubenswrapper[4837]: I0313 11:53:41.930609 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" 
podUID="c6c2dd46-4cc0-4802-b96e-7d395d3dbc50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.038997 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.042159 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.110835 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.519322 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.564842 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.571157 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.638981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.667514 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.707434 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.851979 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 11:53:42 crc kubenswrapper[4837]: I0313 11:53:42.989625 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.052908 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.065609 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.183968 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.268324 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.268571 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.554012 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.572113 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.574565 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.653417 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.826342 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.855045 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 11:53:43 crc kubenswrapper[4837]: I0313 11:53:43.923374 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.035791 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.043561 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.066201 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.237326 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.285627 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.360279 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.581277 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.671749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.771101 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.811995 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 11:53:44 crc kubenswrapper[4837]: I0313 11:53:44.960542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.020127 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.063349 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.259795 4837 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.425025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.452949 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.473733 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.489022 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.902451 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 11:53:45 crc kubenswrapper[4837]: I0313 11:53:45.948501 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.233711 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.351234 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.376678 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.391671 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.537768 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.700101 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.834209 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.834290 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.927867 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960818 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960849 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960866 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960930 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960952 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.960967 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961171 4837 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961189 4837 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961200 4837 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.961212 4837 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:46 crc kubenswrapper[4837]: I0313 11:53:46.967905 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008624 4837 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" exitCode=137 Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008892 4837 scope.go:117] "RemoveContainer" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.008918 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.028322 4837 scope.go:117] "RemoveContainer" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: E0313 11:53:47.028693 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": container with ID starting with f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e not found: ID does not exist" containerID="f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.028722 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e"} err="failed to get container status \"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": rpc error: code = NotFound desc = could not find container \"f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e\": container with ID starting with f462647c368a2c9ec95ac55fb1cc58505023aef3779f0d3c720f9d8861e5c80e not found: ID does not exist" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.055786 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 11:53:47 crc kubenswrapper[4837]: I0313 11:53:47.062069 4837 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:49 crc kubenswrapper[4837]: I0313 11:53:49.933533 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54bb9484b9-l9k8j" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.131434 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.133289 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ft6cr" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" containerID="cri-o://981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.138203 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.138626 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twtbj" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" containerID="cri-o://e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.152014 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.152231 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" 
podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" containerID="cri-o://7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.164443 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.164715 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7crb6" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" containerID="cri-o://caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173166 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:53:59 crc kubenswrapper[4837]: E0313 11:53:59.173472 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173496 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: E0313 11:53:59.173513 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173522 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173668 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.173692 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="def1c7aa-51a6-4ee0-93d5-714721e9fc27" containerName="installer" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.175403 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.177199 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.177421 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ng6kk" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" containerID="cri-o://1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" gracePeriod=30 Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.181622 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.232916 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.232997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.233080 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334560 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.334660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.340069 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87c8f86-a346-4907-9441-048c3220646f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.348749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b87c8f86-a346-4907-9441-048c3220646f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.374043 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdm8\" (UniqueName: \"kubernetes.io/projected/b87c8f86-a346-4907-9441-048c3220646f-kube-api-access-rgdm8\") pod \"marketplace-operator-79b997595-7rzpc\" (UID: \"b87c8f86-a346-4907-9441-048c3220646f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.496578 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.732476 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739204 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739281 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.739327 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") pod \"e6060cf2-077e-4112-af57-f100e297f320\" (UID: \"e6060cf2-077e-4112-af57-f100e297f320\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.744742 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities" (OuterVolumeSpecName: "utilities") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.745774 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs" (OuterVolumeSpecName: "kube-api-access-gfvgs") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "kube-api-access-gfvgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.841067 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6060cf2-077e-4112-af57-f100e297f320" (UID: "e6060cf2-077e-4112-af57-f100e297f320"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842171 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfvgs\" (UniqueName: \"kubernetes.io/projected/e6060cf2-077e-4112-af57-f100e297f320-kube-api-access-gfvgs\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842216 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.842228 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6060cf2-077e-4112-af57-f100e297f320-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.938589 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942491 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.942535 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") pod \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\" (UID: \"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d\") " Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.943994 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities" (OuterVolumeSpecName: "utilities") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.948773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz" (OuterVolumeSpecName: "kube-api-access-vpclz") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "kube-api-access-vpclz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.955750 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.958479 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:53:59 crc kubenswrapper[4837]: I0313 11:53:59.997153 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043839 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043937 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.043965 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") pod \"8fb85cad-ec2d-4ada-bd68-55937d96a779\" (UID: \"8fb85cad-ec2d-4ada-bd68-55937d96a779\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057267 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057300 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057379 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") pod \"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\" (UID: 
\"080747b0-3d43-4ff1-b21c-b8ea9fc2f961\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.057458 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") pod \"278c91cc-2624-42cd-a35e-287e22d22f7d\" (UID: \"278c91cc-2624-42cd-a35e-287e22d22f7d\") " Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.044473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.047143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc" (OuterVolumeSpecName: "kube-api-access-j25rc") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "kube-api-access-j25rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.058351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities" (OuterVolumeSpecName: "utilities") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.060547 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities" (OuterVolumeSpecName: "utilities") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.060749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl" (OuterVolumeSpecName: "kube-api-access-bcgkl") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "kube-api-access-bcgkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061219 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpclz\" (UniqueName: \"kubernetes.io/projected/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-kube-api-access-vpclz\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061247 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j25rc\" (UniqueName: \"kubernetes.io/projected/8fb85cad-ec2d-4ada-bd68-55937d96a779-kube-api-access-j25rc\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061261 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061272 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061283 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcgkl\" (UniqueName: \"kubernetes.io/projected/278c91cc-2624-42cd-a35e-287e22d22f7d-kube-api-access-bcgkl\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061293 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.061304 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.062831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8fb85cad-ec2d-4ada-bd68-55937d96a779" (UID: "8fb85cad-ec2d-4ada-bd68-55937d96a779"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.068092 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7rzpc"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.076817 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm" (OuterVolumeSpecName: "kube-api-access-wfddm") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "kube-api-access-wfddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088109 4837 generic.go:334] "Generic (PLEG): container finished" podID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088233 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng6kk" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088294 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088486 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng6kk" event={"ID":"bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d","Type":"ContainerDied","Data":"eac45e620e44e693cbb55f704b7783d81f0f024e3e2cf4051be3383dc9b6b145"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.088509 4837 scope.go:117] "RemoveContainer" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.102135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" event={"ID":"b87c8f86-a346-4907-9441-048c3220646f","Type":"ContainerStarted","Data":"a2529193e61a49baf66b7493e41d129b6cf282b6365689bc07cffc62fec9884b"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.102185 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" (UID: "bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105601 4837 generic.go:334] "Generic (PLEG): container finished" podID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105676 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerDied","Data":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105702 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" event={"ID":"8fb85cad-ec2d-4ada-bd68-55937d96a779","Type":"ContainerDied","Data":"32fbad917c53f080dae29a17b7d2e0db3f0b48efe2df248f03fa8431da965ad3"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.105706 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8vgmn" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116071 4837 generic.go:334] "Generic (PLEG): container finished" podID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7crb6" event={"ID":"080747b0-3d43-4ff1-b21c-b8ea9fc2f961","Type":"ContainerDied","Data":"3672e1f233b40bf42b048214c1fa7e9647f6025a8a0466aed9482e60a925fb22"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.116185 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7crb6" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.118056 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "080747b0-3d43-4ff1-b21c-b8ea9fc2f961" (UID: "080747b0-3d43-4ff1-b21c-b8ea9fc2f961"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120693 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6060cf2-077e-4112-af57-f100e297f320" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120753 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120758 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft6cr" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.120774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft6cr" event={"ID":"e6060cf2-077e-4112-af57-f100e297f320","Type":"ContainerDied","Data":"4b6c9ae51e3fb9c4dadef31697baf0c351e16ed9f865f9be7126242388f9b2dd"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.130376 4837 scope.go:117] "RemoveContainer" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143401 4837 generic.go:334] "Generic (PLEG): container finished" podID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" exitCode=0 Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143482 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twtbj" event={"ID":"278c91cc-2624-42cd-a35e-287e22d22f7d","Type":"ContainerDied","Data":"c6c53bda7c5d3c5997c1cf5e6db327e83ff0de4776f4c37d442594e9111862d1"} Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.143555 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twtbj" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164245 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfddm\" (UniqueName: \"kubernetes.io/projected/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-kube-api-access-wfddm\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164276 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164288 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/080747b0-3d43-4ff1-b21c-b8ea9fc2f961-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.164302 4837 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8fb85cad-ec2d-4ada-bd68-55937d96a779-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.173463 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.178925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179581 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179601 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-content" Mar 13 
11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179644 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179655 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179662 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179688 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179722 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179729 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179736 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179744 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179755 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179770 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179801 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179808 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179814 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179820 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179827 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-utilities" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179846 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179854 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" 
containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179894 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179904 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179912 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179920 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.179932 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.179940 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="extract-content" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180108 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180118 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6060cf2-077e-4112-af57-f100e297f320" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" containerName="marketplace-operator" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180137 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180145 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" containerName="registry-server" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.180668 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.187314 4837 scope.go:117] "RemoveContainer" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.187497 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8vgmn"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195053 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195184 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.195251 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.198068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.208803 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "278c91cc-2624-42cd-a35e-287e22d22f7d" (UID: "278c91cc-2624-42cd-a35e-287e22d22f7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.219262 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.229852 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ft6cr"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.264924 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.265014 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/278c91cc-2624-42cd-a35e-287e22d22f7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.277876 4837 scope.go:117] "RemoveContainer" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.282471 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": container with ID starting with 1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0 not found: ID does not exist" containerID="1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.282531 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0"} err="failed to get container status 
\"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": rpc error: code = NotFound desc = could not find container \"1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0\": container with ID starting with 1b6b0960e651037356989556f5ddff9457e82572c75941cbde7fc59810854ea0 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.282564 4837 scope.go:117] "RemoveContainer" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.282981 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": container with ID starting with f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51 not found: ID does not exist" containerID="f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283023 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51"} err="failed to get container status \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": rpc error: code = NotFound desc = could not find container \"f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51\": container with ID starting with f0462ba3a7de7e503c15d1b89ef700574f3f475b90e578e1c36542ba0d37ed51 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283052 4837 scope.go:117] "RemoveContainer" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.283555 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": container with ID starting with 96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305 not found: ID does not exist" containerID="96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283580 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305"} err="failed to get container status \"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": rpc error: code = NotFound desc = could not find container \"96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305\": container with ID starting with 96174c590656df138c1d79af4e8416815fc220b78535e35ba951e58fb70ac305 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.283595 4837 scope.go:117] "RemoveContainer" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.307418 4837 scope.go:117] "RemoveContainer" containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.308194 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": container with ID starting with 7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090 not found: ID does not exist" 
containerID="7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.308239 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090"} err="failed to get container status \"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": rpc error: code = NotFound desc = could not find container \"7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090\": container with ID starting with 7062c61986b41d101ebecc3d1bfaa5e447d278c907a23b8b3db80e27716fe090 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.308268 4837 scope.go:117] "RemoveContainer" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.320665 4837 scope.go:117] "RemoveContainer" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.334897 4837 scope.go:117] "RemoveContainer" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.351572 4837 scope.go:117] "RemoveContainer" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.352071 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": container with ID starting with caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa not found: ID does not exist" containerID="caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.352109 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa"} err="failed to get container status \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": rpc error: code = NotFound desc = could not find container \"caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa\": container with ID starting with caf645720e683fd04b4144b714a66fde6f0b64f2a123d5270dabac05a2a4caaa not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.352134 4837 scope.go:117] "RemoveContainer" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.353121 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": container with ID starting with 0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650 not found: ID does not exist" containerID="0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.353159 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650"} err="failed to get container status \"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": rpc error: code = NotFound desc = could not find container 
\"0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650\": container with ID starting with 0d9d2068fa9a75fcedf62135c026dbc7b6be8fecbfe8ba1e1bd893e7874fe650 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.353392 4837 scope.go:117] "RemoveContainer" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.355099 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": container with ID starting with 86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298 not found: ID does not exist" containerID="86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.355147 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298"} err="failed to get container status \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": rpc error: code = NotFound desc = could not find container \"86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298\": container with ID starting with 86b2c7193d237a632c12a22d24c63b2f2247ec49e6eadaf1bcddddc01304e298 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.355184 4837 scope.go:117] "RemoveContainer" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.366246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.378709 4837 scope.go:117] "RemoveContainer" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.384589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"auto-csr-approver-29556714-jzzgx\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.409018 4837 scope.go:117] "RemoveContainer" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.422800 4837 scope.go:117] "RemoveContainer" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424002 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": container with ID starting with 981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1 not found: ID does not exist" containerID="981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424037 4837 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1"} err="failed to get container status \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": rpc error: code = NotFound desc = could not find container \"981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1\": container with ID starting with 981d238a29da8dc69fd7413479e02e57c3595b2787cab7169c57d333172bede1 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424056 4837 scope.go:117] "RemoveContainer" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424337 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": container with ID starting with 18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae not found: ID does not exist" containerID="18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424373 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae"} err="failed to get container status \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": rpc error: code = NotFound desc = could not find container \"18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae\": container with ID starting with 18157b9c0c2686c96644926d7ef6b55f8879075870dcef9cd6f8b2f09be008ae not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424389 4837 scope.go:117] "RemoveContainer" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.424710 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": container with ID starting with 92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7 not found: ID does not exist" containerID="92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424737 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7"} err="failed to get container status \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": rpc error: code = NotFound desc = could not find container \"92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7\": container with ID starting with 92a2c4e8c63e772dff31e06b47098a1634c8714cf4402cee4344a939f71bc1a7 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.424755 4837 scope.go:117] "RemoveContainer" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.430164 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.435868 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ng6kk"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.444062 4837 scope.go:117] 
"RemoveContainer" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.451805 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.456765 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7crb6"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.461431 4837 scope.go:117] "RemoveContainer" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.469179 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.474844 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twtbj"] Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.491815 4837 scope.go:117] "RemoveContainer" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.492313 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": container with ID starting with e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c not found: ID does not exist" containerID="e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.492353 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c"} err="failed to get container status \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": rpc error: code = NotFound desc = could not find container \"e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c\": container with ID starting with e6908d46230c52fea1c314d660f53fb74dbfd03beed19d0d7b5d526d78fc8a6c not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.492379 4837 scope.go:117] "RemoveContainer" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.492974 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": container with ID starting with a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3 not found: ID does not exist" containerID="a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493025 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3"} err="failed to get container status \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": rpc error: code = NotFound desc = could not find container \"a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3\": container with ID starting with a7440d1435344c8bb57482e03b8adcb5ddb52bcac5f7b4d1ca42e60d0e6e91e3 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493054 4837 scope.go:117] "RemoveContainer" 
containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3" Mar 13 11:54:00 crc kubenswrapper[4837]: E0313 11:54:00.493316 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": container with ID starting with 2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3 not found: ID does not exist" containerID="2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.493347 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3"} err="failed to get container status \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": rpc error: code = NotFound desc = could not find container \"2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3\": container with ID starting with 2e92979b20e46c7135bec8322dc3792b24e0a0cbeac503bda1fa03e0621168f3 not found: ID does not exist" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.529057 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:00 crc kubenswrapper[4837]: I0313 11:54:00.947865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 11:54:00 crc kubenswrapper[4837]: W0313 11:54:00.953351 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7a269a_3d94_4758_922d_9886312f2a25.slice/crio-f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039 WatchSource:0}: Error finding container f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039: Status 404 returned error can't find the container with id f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039 Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.058389 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="080747b0-3d43-4ff1-b21c-b8ea9fc2f961" path="/var/lib/kubelet/pods/080747b0-3d43-4ff1-b21c-b8ea9fc2f961/volumes" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.059406 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278c91cc-2624-42cd-a35e-287e22d22f7d" path="/var/lib/kubelet/pods/278c91cc-2624-42cd-a35e-287e22d22f7d/volumes" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.060174 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb85cad-ec2d-4ada-bd68-55937d96a779" path="/var/lib/kubelet/pods/8fb85cad-ec2d-4ada-bd68-55937d96a779/volumes" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.061092 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d" path="/var/lib/kubelet/pods/bf40d0dd-bb1c-470d-97c9-dbbbd4625e5d/volumes" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.061685 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6060cf2-077e-4112-af57-f100e297f320" path="/var/lib/kubelet/pods/e6060cf2-077e-4112-af57-f100e297f320/volumes" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.149587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" 
event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerStarted","Data":"f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039"} Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.150934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" event={"ID":"b87c8f86-a346-4907-9441-048c3220646f","Type":"ContainerStarted","Data":"469523a6f85ae1746666577053ce1be84d9142d1935d58c9331cba700d12263d"} Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.151260 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.154669 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" Mar 13 11:54:01 crc kubenswrapper[4837]: I0313 11:54:01.170104 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7rzpc" podStartSLOduration=2.170085533 podStartE2EDuration="2.170085533s" podCreationTimestamp="2026-03-13 11:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:54:01.165291687 +0000 UTC m=+356.803558450" watchObservedRunningTime="2026-03-13 11:54:01.170085533 +0000 UTC m=+356.808352286" Mar 13 11:54:02 crc kubenswrapper[4837]: I0313 11:54:02.164773 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerStarted","Data":"35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960"} Mar 13 11:54:02 crc kubenswrapper[4837]: I0313 11:54:02.176369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" podStartSLOduration=1.372039008 podStartE2EDuration="2.176348919s" podCreationTimestamp="2026-03-13 11:54:00 +0000 UTC" firstStartedPulling="2026-03-13 11:54:00.958314908 +0000 UTC m=+356.596581681" lastFinishedPulling="2026-03-13 11:54:01.762624829 +0000 UTC m=+357.400891592" observedRunningTime="2026-03-13 11:54:02.175548483 +0000 UTC m=+357.813815246" watchObservedRunningTime="2026-03-13 11:54:02.176348919 +0000 UTC m=+357.814615682" Mar 13 11:54:03 crc kubenswrapper[4837]: I0313 11:54:03.172600 4837 generic.go:334] "Generic (PLEG): container finished" podID="2b7a269a-3d94-4758-922d-9886312f2a25" containerID="35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960" exitCode=0 Mar 13 11:54:03 crc kubenswrapper[4837]: I0313 11:54:03.172694 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerDied","Data":"35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960"} Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.522901 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.648386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") pod \"2b7a269a-3d94-4758-922d-9886312f2a25\" (UID: \"2b7a269a-3d94-4758-922d-9886312f2a25\") " Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.653270 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql" (OuterVolumeSpecName: "kube-api-access-7jdql") pod "2b7a269a-3d94-4758-922d-9886312f2a25" (UID: "2b7a269a-3d94-4758-922d-9886312f2a25"). InnerVolumeSpecName "kube-api-access-7jdql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:04 crc kubenswrapper[4837]: I0313 11:54:04.749268 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdql\" (UniqueName: \"kubernetes.io/projected/2b7a269a-3d94-4758-922d-9886312f2a25-kube-api-access-7jdql\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188334 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" event={"ID":"2b7a269a-3d94-4758-922d-9886312f2a25","Type":"ContainerDied","Data":"f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039"} Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188399 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d46501ec1fa73334626e53641f933a075eca354c4c9f492c76bbcb4a084039" Mar 13 11:54:05 crc kubenswrapper[4837]: I0313 11:54:05.188441 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556714-jzzgx" Mar 13 11:54:17 crc kubenswrapper[4837]: I0313 11:54:17.996997 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.022002 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" containerID="cri-o://7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4" gracePeriod=15 Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.407790 4837 generic.go:334] "Generic (PLEG): container finished" podID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerID="7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4" exitCode=0 Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.407876 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerDied","Data":"7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4"} Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.408196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" event={"ID":"27d45de2-e0ab-4c3e-b3da-b20e60e26801","Type":"ContainerDied","Data":"48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078"} Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.408209 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48f88856d0aa99c22451af4774004c789a7baf644ed71ee96a301b56c7368078" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.423515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.458977 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"] Mar 13 11:54:43 crc kubenswrapper[4837]: E0313 11:54:43.459230 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459251 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc" Mar 13 11:54:43 crc kubenswrapper[4837]: E0313 11:54:43.459469 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459673 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459863 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" containerName="oauth-openshift" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.459892 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" containerName="oc" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.460359 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.465009 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"] Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505472 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505499 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505529 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505593 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505674 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505702 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505736 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505822 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.505860 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") pod \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\" (UID: \"27d45de2-e0ab-4c3e-b3da-b20e60e26801\") " Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506054 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506137 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506175 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506206 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506248 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506259 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506323 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506462 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506467 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506490 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9grzq\" (UniqueName: \"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506667 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506685 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506851 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506864 4837 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506883 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506900 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.506697 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.512848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.514211 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.514399 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515042 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.515993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.516673 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.517687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.525622 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764" (OuterVolumeSpecName: "kube-api-access-zn764") pod "27d45de2-e0ab-4c3e-b3da-b20e60e26801" (UID: "27d45de2-e0ab-4c3e-b3da-b20e60e26801"). InnerVolumeSpecName "kube-api-access-zn764". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608014 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608278 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608371 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608464 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608362 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-dir\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.608891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609165 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609219 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609262 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9grzq\" (UniqueName: \"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609329 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609473 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609491 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609502 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609514 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609526 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn764\" (UniqueName: \"kubernetes.io/projected/27d45de2-e0ab-4c3e-b3da-b20e60e26801-kube-api-access-zn764\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609536 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609548 4837 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d45de2-e0ab-4c3e-b3da-b20e60e26801-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609557 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609568 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609577 4837 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d45de2-e0ab-4c3e-b3da-b20e60e26801-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.609220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-service-ca\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.610165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10374b6c-b203-46d7-856b-ca95bb2f19a7-audit-policies\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.611325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-error\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.612898 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-router-certs\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-login\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613214 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.613738 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-session\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.614579 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-system-ocp-branding-template\") 
pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.626950 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10374b6c-b203-46d7-856b-ca95bb2f19a7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.629354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9grzq\" (UniqueName: \"kubernetes.io/projected/10374b6c-b203-46d7-856b-ca95bb2f19a7-kube-api-access-9grzq\") pod \"oauth-openshift-576c48cff9-gb95w\" (UID: \"10374b6c-b203-46d7-856b-ca95bb2f19a7\") " pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:43 crc kubenswrapper[4837]: I0313 11:54:43.776559 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.211062 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-576c48cff9-gb95w"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.366612 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjvc6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.368337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.371140 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.382808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjvc6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.414481 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" event={"ID":"10374b6c-b203-46d7-856b-ca95bb2f19a7","Type":"ContainerStarted","Data":"b9ac8d05b9f1c86c2c199c32721944cbd6e9fdc505e7326b2ec5798ccc5c9882"} Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.414509 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8q6j6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.417953 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.446205 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.449112 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8q6j6"] Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.519906 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.520624 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-catalog-content\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.520803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07889497-1048-4f7a-9245-132767bb28b6-utilities\") pod \"community-operators-tjvc6\" 
(UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.538574 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5r4\" (UniqueName: \"kubernetes.io/projected/07889497-1048-4f7a-9245-132767bb28b6-kube-api-access-9t5r4\") pod \"community-operators-tjvc6\" (UID: \"07889497-1048-4f7a-9245-132767bb28b6\") " pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:44 crc kubenswrapper[4837]: I0313 11:54:44.725492 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.055050 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d45de2-e0ab-4c3e-b3da-b20e60e26801" path="/var/lib/kubelet/pods/27d45de2-e0ab-4c3e-b3da-b20e60e26801/volumes" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.169983 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjvc6"] Mar 13 11:54:45 crc kubenswrapper[4837]: W0313 11:54:45.178148 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07889497_1048_4f7a_9245_132767bb28b6.slice/crio-88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8 WatchSource:0}: Error finding container 88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8: Status 404 returned error can't find the container with id 88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8 Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.422864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" event={"ID":"10374b6c-b203-46d7-856b-ca95bb2f19a7","Type":"ContainerStarted","Data":"8f25fb8bd0e85a8245bb09f66b22b04c88fd4b7f814a4f8e7416ee0bd04e0d77"} Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.424785 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426401 4837 generic.go:334] "Generic (PLEG): container finished" podID="07889497-1048-4f7a-9245-132767bb28b6" containerID="d65091f8ad9f264991fc060bd7f7b7cc92043cbc5ac816b835141daaa1a15860" exitCode=0 Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerDied","Data":"d65091f8ad9f264991fc060bd7f7b7cc92043cbc5ac816b835141daaa1a15860"} Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.426534 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"88809656a4007e3a80b8ce9dec903bad48ed23e1bf04235cdee5afb969a57ed8"} Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.435512 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.455011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-576c48cff9-gb95w" podStartSLOduration=27.454991948 
podStartE2EDuration="27.454991948s" podCreationTimestamp="2026-03-13 11:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:54:45.452325453 +0000 UTC m=+401.090592246" watchObservedRunningTime="2026-03-13 11:54:45.454991948 +0000 UTC m=+401.093258711" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.759162 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"] Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.760189 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.761958 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.774278 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"] Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839455 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.839774 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.941894 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942288 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942474 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-catalog-content\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.942776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d96905d-521e-4ab9-87a8-d6edd0c027ed-utilities\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:45 crc kubenswrapper[4837]: I0313 11:54:45.962464 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgw6h\" (UniqueName: \"kubernetes.io/projected/8d96905d-521e-4ab9-87a8-d6edd0c027ed-kube-api-access-hgw6h\") pod \"redhat-operators-kpp2z\" (UID: \"8d96905d-521e-4ab9-87a8-d6edd0c027ed\") " pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.082127 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.434622 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d"} Mar 13 11:54:46 crc kubenswrapper[4837]: I0313 11:54:46.489763 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpp2z"] Mar 13 11:54:46 crc kubenswrapper[4837]: W0313 11:54:46.533001 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d96905d_521e_4ab9_87a8_d6edd0c027ed.slice/crio-4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65 WatchSource:0}: Error finding container 4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65: Status 404 returned error can't find the container with id 4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65 Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.158916 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zckjb"] Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.160086 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.162018 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.169034 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zckjb"] Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267168 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267256 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.267564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368737 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368821 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.368877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.369443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-catalog-content\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.369591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4298f221-fd11-49a1-a0e9-6f95dbdedc44-utilities\") pod \"certified-operators-zckjb\" (UID: 
\"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.393563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6zs\" (UniqueName: \"kubernetes.io/projected/4298f221-fd11-49a1-a0e9-6f95dbdedc44-kube-api-access-xj6zs\") pod \"certified-operators-zckjb\" (UID: \"4298f221-fd11-49a1-a0e9-6f95dbdedc44\") " pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440562 4837 generic.go:334] "Generic (PLEG): container finished" podID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerID="21d402f48e20199c93f215f89accb38b25a12a6308104cc7886b3e3df832e271" exitCode=0 Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerDied","Data":"21d402f48e20199c93f215f89accb38b25a12a6308104cc7886b3e3df832e271"} Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.440805 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"4b96d844ae19ddd09e809c479227559e4835154b1a463a235e51a5a8275afc65"} Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.442540 4837 generic.go:334] "Generic (PLEG): container finished" podID="07889497-1048-4f7a-9245-132767bb28b6" containerID="f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d" exitCode=0 Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.442627 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerDied","Data":"f302494fedb15539b050a28041be2f2f75279a64dd9207158b12d8d4c083cf2d"} Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.477884 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:47 crc kubenswrapper[4837]: I0313 11:54:47.870630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zckjb"] Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.156927 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"] Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.157950 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.160282 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.171048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"] Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183106 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183158 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.183227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.283881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.283934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-catalog-content\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.284797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec78503-41e5-45f4-9217-1debe55ec107-utilities\") pod \"redhat-marketplace-m5v5n\" (UID: 
\"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.304005 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xqh\" (UniqueName: \"kubernetes.io/projected/fec78503-41e5-45f4-9217-1debe55ec107-kube-api-access-w2xqh\") pod \"redhat-marketplace-m5v5n\" (UID: \"fec78503-41e5-45f4-9217-1debe55ec107\") " pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452114 4837 generic.go:334] "Generic (PLEG): container finished" podID="4298f221-fd11-49a1-a0e9-6f95dbdedc44" containerID="18e61808bcae466833085d24179cdda44de2a637db7884a1b4abbd72b382f4a5" exitCode=0 Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerDied","Data":"18e61808bcae466833085d24179cdda44de2a637db7884a1b4abbd72b382f4a5"} Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.452587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerStarted","Data":"786b091b8fbd379c50d6730e052c15a6bae184fc90f99430dc54b0aa7193c96c"} Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.456312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522"} Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.461940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjvc6" event={"ID":"07889497-1048-4f7a-9245-132767bb28b6","Type":"ContainerStarted","Data":"6dc4460a0d460c82f61640b3c5c6c53eac6f4b5becc2eae019cec7a28347f2bd"} Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.480121 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.519256 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjvc6" podStartSLOduration=1.8924836040000002 podStartE2EDuration="4.519237885s" podCreationTimestamp="2026-03-13 11:54:44 +0000 UTC" firstStartedPulling="2026-03-13 11:54:45.42835796 +0000 UTC m=+401.066624723" lastFinishedPulling="2026-03-13 11:54:48.055112241 +0000 UTC m=+403.693379004" observedRunningTime="2026-03-13 11:54:48.519029789 +0000 UTC m=+404.157296552" watchObservedRunningTime="2026-03-13 11:54:48.519237885 +0000 UTC m=+404.157504648" Mar 13 11:54:48 crc kubenswrapper[4837]: I0313 11:54:48.953043 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5v5n"] Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.469078 4837 generic.go:334] "Generic (PLEG): container finished" podID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerID="c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522" exitCode=0 Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.469195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerDied","Data":"c2579f686b8be02a64326106fec11a117b09f60501908a0ca1df4ea63ca94522"} Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.472946 4837 generic.go:334] "Generic (PLEG): container finished" podID="fec78503-41e5-45f4-9217-1debe55ec107" containerID="e1853ac25f407645157ae4b773160c6c162ac692f7690b340b59353f7a9f34a9" exitCode=0 Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.472990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerDied","Data":"e1853ac25f407645157ae4b773160c6c162ac692f7690b340b59353f7a9f34a9"} Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.473030 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerStarted","Data":"5e9f2d804e0645269d40b445a2e8eaa9cf543ef4b51c2e12b9e3e4addefc241c"} Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.476374 4837 generic.go:334] "Generic (PLEG): container finished" podID="4298f221-fd11-49a1-a0e9-6f95dbdedc44" containerID="13331e95c57b7e12e7a85ec5552e0f4233276b481c6c04bd1f77be9de05f66ec" exitCode=0 Mar 13 11:54:49 crc kubenswrapper[4837]: I0313 11:54:49.476479 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerDied","Data":"13331e95c57b7e12e7a85ec5552e0f4233276b481c6c04bd1f77be9de05f66ec"} Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.483833 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpp2z" event={"ID":"8d96905d-521e-4ab9-87a8-d6edd0c027ed","Type":"ContainerStarted","Data":"4be2ec40a620706421a0a4c0f49c8c79e68837d9d8e70c2a0546a05222c9171c"} Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.486189 4837 generic.go:334] "Generic (PLEG): container finished" podID="fec78503-41e5-45f4-9217-1debe55ec107" containerID="be6a714256c0eb57e3916efdf5b2ce4ae349e4629a72382f7d597ae3648b7920" exitCode=0 Mar 13 11:54:50 crc kubenswrapper[4837]: 
I0313 11:54:50.486246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerDied","Data":"be6a714256c0eb57e3916efdf5b2ce4ae349e4629a72382f7d597ae3648b7920"} Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.489973 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zckjb" event={"ID":"4298f221-fd11-49a1-a0e9-6f95dbdedc44","Type":"ContainerStarted","Data":"860677672a2e1ac86a6b3b31b163a6c722e4b411174f630862f72771875670b8"} Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.513868 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpp2z" podStartSLOduration=3.051657229 podStartE2EDuration="5.51385263s" podCreationTimestamp="2026-03-13 11:54:45 +0000 UTC" firstStartedPulling="2026-03-13 11:54:47.442738674 +0000 UTC m=+403.081005437" lastFinishedPulling="2026-03-13 11:54:49.904934075 +0000 UTC m=+405.543200838" observedRunningTime="2026-03-13 11:54:50.511665991 +0000 UTC m=+406.149932754" watchObservedRunningTime="2026-03-13 11:54:50.51385263 +0000 UTC m=+406.152119393" Mar 13 11:54:50 crc kubenswrapper[4837]: I0313 11:54:50.557825 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zckjb" podStartSLOduration=2.186153118 podStartE2EDuration="3.55780051s" podCreationTimestamp="2026-03-13 11:54:47 +0000 UTC" firstStartedPulling="2026-03-13 11:54:48.455745202 +0000 UTC m=+404.094011965" lastFinishedPulling="2026-03-13 11:54:49.827392594 +0000 UTC m=+405.465659357" observedRunningTime="2026-03-13 11:54:50.55560303 +0000 UTC m=+406.193869793" watchObservedRunningTime="2026-03-13 11:54:50.55780051 +0000 UTC m=+406.196067273" Mar 13 11:54:51 crc kubenswrapper[4837]: I0313 11:54:51.496759 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5v5n" event={"ID":"fec78503-41e5-45f4-9217-1debe55ec107","Type":"ContainerStarted","Data":"01939a6de402f495d6072809827b6480bdfc5a111a835043e7e862cf70198082"} Mar 13 11:54:51 crc kubenswrapper[4837]: I0313 11:54:51.524783 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5v5n" podStartSLOduration=2.095969219 podStartE2EDuration="3.524756111s" podCreationTimestamp="2026-03-13 11:54:48 +0000 UTC" firstStartedPulling="2026-03-13 11:54:49.474363549 +0000 UTC m=+405.112630312" lastFinishedPulling="2026-03-13 11:54:50.903150421 +0000 UTC m=+406.541417204" observedRunningTime="2026-03-13 11:54:51.521814967 +0000 UTC m=+407.160081750" watchObservedRunningTime="2026-03-13 11:54:51.524756111 +0000 UTC m=+407.163022874" Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.726127 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.726496 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:54 crc kubenswrapper[4837]: I0313 11:54:54.763580 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:55 crc kubenswrapper[4837]: I0313 11:54:55.560242 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-tjvc6" Mar 13 11:54:56 crc kubenswrapper[4837]: I0313 11:54:56.082605 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:56 crc kubenswrapper[4837]: I0313 11:54:56.082660 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.122612 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpp2z" podUID="8d96905d-521e-4ab9-87a8-d6edd0c027ed" containerName="registry-server" probeResult="failure" output=< Mar 13 11:54:57 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 11:54:57 crc kubenswrapper[4837]: > Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.477986 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.478106 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.519070 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:57 crc kubenswrapper[4837]: I0313 11:54:57.563088 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zckjb" Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.480973 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.481009 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.586860 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:54:58 crc kubenswrapper[4837]: I0313 11:54:58.625874 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5v5n" Mar 13 11:55:05 crc kubenswrapper[4837]: I0313 11:55:05.484443 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:55:05 crc kubenswrapper[4837]: I0313 11:55:05.485054 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:55:06 crc kubenswrapper[4837]: I0313 11:55:06.130321 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:55:06 crc kubenswrapper[4837]: I0313 11:55:06.185118 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpp2z" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.275405 4837 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"] Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.276041 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.295782 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"] Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315272 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315331 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315445 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315498 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315519 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315614 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.315704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.339762 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416544 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416680 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.416758 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.417394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/411da074-7224-4c1c-a75a-a5c3f29c0e92-ca-trust-extracted\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.418687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-trusted-ca\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.418746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-certificates\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.425465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-registry-tls\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.434445 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/411da074-7224-4c1c-a75a-a5c3f29c0e92-installation-pull-secrets\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.438730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjlr\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-kube-api-access-gxjlr\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.440812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/411da074-7224-4c1c-a75a-a5c3f29c0e92-bound-sa-token\") pod \"image-registry-66df7c8f76-l62n7\" (UID: \"411da074-7224-4c1c-a75a-a5c3f29c0e92\") " pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:07 crc kubenswrapper[4837]: I0313 11:55:07.593895 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.009613 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-l62n7"] Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591064 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" event={"ID":"411da074-7224-4c1c-a75a-a5c3f29c0e92","Type":"ContainerStarted","Data":"fa9d50ab5efd4579c0c94984a013857fdff7fde0ce55c9c51952d9d7399085ac"} Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" event={"ID":"411da074-7224-4c1c-a75a-a5c3f29c0e92","Type":"ContainerStarted","Data":"6608129f77fac4bfe6ff46697f6b6f8e01a96ebc0c5e019b1e4e787315379858"} Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.591724 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:08 crc kubenswrapper[4837]: I0313 11:55:08.617834 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" podStartSLOduration=1.617813887 podStartE2EDuration="1.617813887s" podCreationTimestamp="2026-03-13 11:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:55:08.612154666 +0000 UTC m=+424.250421429" watchObservedRunningTime="2026-03-13 11:55:08.617813887 +0000 UTC m=+424.256080650" Mar 13 11:55:27 crc kubenswrapper[4837]: I0313 11:55:27.600090 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-l62n7" Mar 13 11:55:27 crc kubenswrapper[4837]: I0313 11:55:27.643719 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:35 crc kubenswrapper[4837]: I0313 11:55:35.483617 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:55:35 crc kubenswrapper[4837]: I0313 11:55:35.484144 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.691323 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" containerID="cri-o://7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" gracePeriod=30 Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.839372 4837 generic.go:334] "Generic (PLEG): container finished" podID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerID="7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" exitCode=0 Mar 13 11:55:52 crc kubenswrapper[4837]: I0313 11:55:52.839417 4837 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerDied","Data":"7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9"} Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.044110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.224911 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.224983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225053 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225410 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225471 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225518 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225579 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.225694 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.226140 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.226827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.231256 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7" (OuterVolumeSpecName: "kube-api-access-g7xs7") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "kube-api-access-g7xs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.235919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.235921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.236344 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: E0313 11:55:53.237070 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:9da9cfd5-f798-42e0-af98-8378cf8d1e5f nodeName:}" failed. No retries permitted until 2026-03-13 11:55:53.737032374 +0000 UTC m=+469.375299207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.240445 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326725 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326771 4837 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326785 4837 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326797 4837 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326807 4837 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326819 4837 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.326830 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7xs7\" (UniqueName: \"kubernetes.io/projected/9da9cfd5-f798-42e0-af98-8378cf8d1e5f-kube-api-access-g7xs7\") on node \"crc\" DevicePath \"\"" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.833571 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\" (UID: \"9da9cfd5-f798-42e0-af98-8378cf8d1e5f\") " Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.841367 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9da9cfd5-f798-42e0-af98-8378cf8d1e5f" (UID: "9da9cfd5-f798-42e0-af98-8378cf8d1e5f"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" event={"ID":"9da9cfd5-f798-42e0-af98-8378cf8d1e5f","Type":"ContainerDied","Data":"791f2e4e796f079af101ec362853eaa486bb3e46d120e36fdb1c000b9b27a22e"} Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850232 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2w96t" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.850242 4837 scope.go:117] "RemoveContainer" containerID="7f6dc77957ef0c3112728bef3166915837fb45018b662ba23f21fb9a5b1d11d9" Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.879686 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:53 crc kubenswrapper[4837]: I0313 11:55:53.884195 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2w96t"] Mar 13 11:55:55 crc kubenswrapper[4837]: I0313 11:55:55.056323 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" path="/var/lib/kubelet/pods/9da9cfd5-f798-42e0-af98-8378cf8d1e5f/volumes" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.125775 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: E0313 11:56:00.127569 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.127606 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.127748 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da9cfd5-f798-42e0-af98-8378cf8d1e5f" containerName="registry" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.128408 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.131438 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.132008 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.132528 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.137985 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.309025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.410785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.430353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"auto-csr-approver-29556716-csq4j\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.463715 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.634386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 11:56:00 crc kubenswrapper[4837]: I0313 11:56:00.885698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerStarted","Data":"abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919"} Mar 13 11:56:02 crc kubenswrapper[4837]: I0313 11:56:02.900356 4837 generic.go:334] "Generic (PLEG): container finished" podID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerID="b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13" exitCode=0 Mar 13 11:56:02 crc kubenswrapper[4837]: I0313 11:56:02.900468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerDied","Data":"b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13"} Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.132304 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.261153 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") pod \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\" (UID: \"0a7b275e-9d21-4da0-8bb8-0fee8434ce82\") " Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.266318 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw" (OuterVolumeSpecName: "kube-api-access-ltgvw") pod "0a7b275e-9d21-4da0-8bb8-0fee8434ce82" (UID: "0a7b275e-9d21-4da0-8bb8-0fee8434ce82"). InnerVolumeSpecName "kube-api-access-ltgvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.362569 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltgvw\" (UniqueName: \"kubernetes.io/projected/0a7b275e-9d21-4da0-8bb8-0fee8434ce82-kube-api-access-ltgvw\") on node \"crc\" DevicePath \"\"" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.914940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556716-csq4j" event={"ID":"0a7b275e-9d21-4da0-8bb8-0fee8434ce82","Type":"ContainerDied","Data":"abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919"} Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.915269 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd71683adb08a7e3429bf21f122cd526828d14dc11d94dfe8a93cf1f3cae919" Mar 13 11:56:04 crc kubenswrapper[4837]: I0313 11:56:04.914993 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556716-csq4j" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.200034 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.203271 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556710-lcprh"] Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483716 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483807 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.483881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.484860 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.485030 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752" gracePeriod=600 Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.922723 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752" exitCode=0 Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.922763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752"} Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.923014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"} Mar 13 11:56:05 crc kubenswrapper[4837]: I0313 11:56:05.923034 4837 scope.go:117] "RemoveContainer" containerID="87e8fbda4a5050c062e330cf8670520af017565db798af0df232b0dbb4564a7a" Mar 13 11:56:07 crc kubenswrapper[4837]: I0313 11:56:07.056332 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0484d991-f239-47a2-80ff-0237945c27ac" 
path="/var/lib/kubelet/pods/0484d991-f239-47a2-80ff-0237945c27ac/volumes" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.143883 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 11:58:00 crc kubenswrapper[4837]: E0313 11:58:00.144947 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.144967 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.145088 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" containerName="oc" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.145624 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.149899 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.149995 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.150012 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.150329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.177044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.278032 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.298382 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"auto-csr-approver-29556718-7z6qj\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.471595 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.682910 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 11:58:00 crc kubenswrapper[4837]: I0313 11:58:00.688362 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 11:58:01 crc kubenswrapper[4837]: I0313 11:58:01.613757 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerStarted","Data":"82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb"} Mar 13 11:58:02 crc kubenswrapper[4837]: I0313 11:58:02.621148 4837 generic.go:334] "Generic (PLEG): container finished" podID="aa01e7a4-71d3-4c91-8319-52a575269601" containerID="f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9" exitCode=0 Mar 13 11:58:02 crc kubenswrapper[4837]: I0313 11:58:02.621200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerDied","Data":"f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9"} Mar 13 11:58:03 crc kubenswrapper[4837]: I0313 11:58:03.870697 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.022021 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") pod \"aa01e7a4-71d3-4c91-8319-52a575269601\" (UID: \"aa01e7a4-71d3-4c91-8319-52a575269601\") " Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.027773 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx" (OuterVolumeSpecName: "kube-api-access-2qskx") pod "aa01e7a4-71d3-4c91-8319-52a575269601" (UID: "aa01e7a4-71d3-4c91-8319-52a575269601"). InnerVolumeSpecName "kube-api-access-2qskx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.123351 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qskx\" (UniqueName: \"kubernetes.io/projected/aa01e7a4-71d3-4c91-8319-52a575269601-kube-api-access-2qskx\") on node \"crc\" DevicePath \"\"" Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638548 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" event={"ID":"aa01e7a4-71d3-4c91-8319-52a575269601","Type":"ContainerDied","Data":"82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb"} Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638594 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ceeb88401e1a8931c1864b4f9ba89d57c9e6f7a5005ce9219be818b9826cdb" Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.638681 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556718-7z6qj" Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.938702 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:58:04 crc kubenswrapper[4837]: I0313 11:58:04.944908 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556712-g8877"] Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.059875 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87edec8a-33b2-44c0-bbcb-1e4f5dded1b2" path="/var/lib/kubelet/pods/87edec8a-33b2-44c0-bbcb-1e4f5dded1b2/volumes" Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.484512 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:58:05 crc kubenswrapper[4837]: I0313 11:58:05.484589 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:58:21 crc kubenswrapper[4837]: I0313 11:58:21.629800 4837 scope.go:117] "RemoveContainer" containerID="7788f0babcbd0ba3005289dc42abd3560a56f1f0efe57b0376342454820793c4" Mar 13 11:58:35 crc kubenswrapper[4837]: I0313 11:58:35.484151 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:58:35 crc kubenswrapper[4837]: I0313 11:58:35.484777 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.483619 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484199 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484244 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484848 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 11:59:05 crc kubenswrapper[4837]: I0313 11:59:05.484939 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" gracePeriod=600 Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031539 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" exitCode=0 Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031870 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2"} Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.031974 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"} Mar 13 11:59:06 crc kubenswrapper[4837]: I0313 11:59:06.032016 4837 scope.go:117] "RemoveContainer" containerID="ea590165224c827e615cf9230078895eabcfa03489c1d34d92662f043fe58752" Mar 13 11:59:21 crc kubenswrapper[4837]: I0313 11:59:21.670629 4837 scope.go:117] "RemoveContainer" containerID="8c4d75bce91d26c5c90ccce3126b557507017a92b0dd1db884cee46957fc8b2f" Mar 13 11:59:21 crc kubenswrapper[4837]: I0313 11:59:21.722387 4837 scope.go:117] "RemoveContainer" containerID="b2ba4ee22041e914a3b1573c300fce67d3ac337c4d9b3d85c86421a82bc9711f" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.661279 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"] Mar 13 11:59:40 crc kubenswrapper[4837]: E0313 11:59:40.662088 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662105 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662232 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" containerName="oc" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.662682 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.664867 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.665712 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xdfzs" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.666414 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.693562 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"] Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.701794 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"] Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.702613 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dlspp" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.705845 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-jnncb" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.709501 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"] Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.710576 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.716489 4837 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-25pqh" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.718953 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"] Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.723336 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"] Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.770132 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp" Mar 13 11:59:40 crc 
kubenswrapper[4837]: I0313 11:59:40.871390 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.871703 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.871803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.889506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47djl\" (UniqueName: \"kubernetes.io/projected/67507b8e-35d5-4dff-9239-45b5ef997e53-kube-api-access-47djl\") pod \"cert-manager-cainjector-cf98fcc89-xzv5h\" (UID: \"67507b8e-35d5-4dff-9239-45b5ef997e53\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.894290 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgn78\" (UniqueName: \"kubernetes.io/projected/0e500b82-1f14-4a1e-937d-00248f195033-kube-api-access-xgn78\") pod \"cert-manager-webhook-687f57d79b-ht9vn\" (UID: \"0e500b82-1f14-4a1e-937d-00248f195033\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.894311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2kk\" (UniqueName: \"kubernetes.io/projected/5ecc1237-3421-41d5-8efb-a62399ae1d73-kube-api-access-hv2kk\") pod \"cert-manager-858654f9db-dlspp\" (UID: \"5ecc1237-3421-41d5-8efb-a62399ae1d73\") " pod="cert-manager/cert-manager-858654f9db-dlspp" Mar 13 11:59:40 crc kubenswrapper[4837]: I0313 11:59:40.996595 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.021834 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dlspp" Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.035167 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.255772 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dlspp"] Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.264014 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ecc1237_3421_41d5_8efb_a62399ae1d73.slice/crio-7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd WatchSource:0}: Error finding container 7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd: Status 404 returned error can't find the container with id 7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.299496 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ht9vn"] Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.303798 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e500b82_1f14_4a1e_937d_00248f195033.slice/crio-ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4 WatchSource:0}: Error finding container ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4: Status 404 returned error can't find the container with id ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4 Mar 13 11:59:41 crc kubenswrapper[4837]: I0313 11:59:41.416409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h"] Mar 13 11:59:41 crc kubenswrapper[4837]: W0313 11:59:41.418106 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67507b8e_35d5_4dff_9239_45b5ef997e53.slice/crio-159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0 WatchSource:0}: Error finding container 159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0: Status 404 returned error can't find the container with id 159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0 Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.256997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" event={"ID":"67507b8e-35d5-4dff-9239-45b5ef997e53","Type":"ContainerStarted","Data":"159f09b062282a35a2aaedb5d409c6e711dd40d0c03a3546319f1b4cc4145fe0"} Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.258741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dlspp" event={"ID":"5ecc1237-3421-41d5-8efb-a62399ae1d73","Type":"ContainerStarted","Data":"7b0568ddce0a7122ede669bc4540653f97b3616bf1c2a18fd7ad5b933bc2ddcd"} Mar 13 11:59:42 crc kubenswrapper[4837]: I0313 11:59:42.261471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" event={"ID":"0e500b82-1f14-4a1e-937d-00248f195033","Type":"ContainerStarted","Data":"ee9015113da4c7d3fbcd027aed4b686615b86b9a2292c4739914d0c68ed415d4"} Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.281032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dlspp" event={"ID":"5ecc1237-3421-41d5-8efb-a62399ae1d73","Type":"ContainerStarted","Data":"44d7dab06b3023c31fc534449c16fc8ad640daae562b4e6c0be834a4f3240fd7"} Mar 13 11:59:45 crc 
kubenswrapper[4837]: I0313 11:59:45.282915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" event={"ID":"67507b8e-35d5-4dff-9239-45b5ef997e53","Type":"ContainerStarted","Data":"1a9bec7e517ddcc37bb4fd44363c9eda050021d2f6a4bf27d0650acae2c529a7"} Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.307666 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-dlspp" podStartSLOduration=2.8141037669999998 podStartE2EDuration="5.307614523s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.270708346 +0000 UTC m=+696.908975109" lastFinishedPulling="2026-03-13 11:59:43.764219082 +0000 UTC m=+699.402485865" observedRunningTime="2026-03-13 11:59:45.302256652 +0000 UTC m=+700.940523415" watchObservedRunningTime="2026-03-13 11:59:45.307614523 +0000 UTC m=+700.945881286" Mar 13 11:59:45 crc kubenswrapper[4837]: I0313 11:59:45.319738 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xzv5h" podStartSLOduration=2.416130128 podStartE2EDuration="5.319714617s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.420380754 +0000 UTC m=+697.058647517" lastFinishedPulling="2026-03-13 11:59:44.323965253 +0000 UTC m=+699.962232006" observedRunningTime="2026-03-13 11:59:45.316865316 +0000 UTC m=+700.955132099" watchObservedRunningTime="2026-03-13 11:59:45.319714617 +0000 UTC m=+700.957981400" Mar 13 11:59:46 crc kubenswrapper[4837]: I0313 11:59:46.294947 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" event={"ID":"0e500b82-1f14-4a1e-937d-00248f195033","Type":"ContainerStarted","Data":"778bc71e71fb427e60132faf96dc5d3b619b9b60802da5fc5908a5736d49e00f"} Mar 13 11:59:46 crc kubenswrapper[4837]: I0313 11:59:46.321254 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" podStartSLOduration=2.403676423 podStartE2EDuration="6.321225614s" podCreationTimestamp="2026-03-13 11:59:40 +0000 UTC" firstStartedPulling="2026-03-13 11:59:41.305900292 +0000 UTC m=+696.944167055" lastFinishedPulling="2026-03-13 11:59:45.223449483 +0000 UTC m=+700.861716246" observedRunningTime="2026-03-13 11:59:46.314450289 +0000 UTC m=+701.952717062" watchObservedRunningTime="2026-03-13 11:59:46.321225614 +0000 UTC m=+701.959492387" Mar 13 11:59:47 crc kubenswrapper[4837]: I0313 11:59:47.300101 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.728250 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"] Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730298 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node" containerID="cri-o://bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730423 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730435 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging" containerID="cri-o://c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730335 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd" containerID="cri-o://7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730528 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb" containerID="cri-o://60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.730252 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller" containerID="cri-o://b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.733550 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb" containerID="cri-o://80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" gracePeriod=30 Mar 13 11:59:50 crc kubenswrapper[4837]: I0313 11:59:50.757283 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" containerID="cri-o://f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" gracePeriod=30 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.038124 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-ht9vn" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.107710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.110458 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-acl-logging/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.111047 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-controller/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.111577 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172624 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bbfz"] Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172861 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172876 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172887 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172893 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172901 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172907 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172914 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172922 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172933 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172940 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172951 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172960 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172971 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172976 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.172986 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.172992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173002 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173009 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173017 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173025 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173035 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kubecfg-setup" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173041 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kubecfg-setup" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173051 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173056 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173170 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="northd" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173182 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173191 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="sbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173200 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="nbdb" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173209 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173219 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-node" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173228 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173237 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173252 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173263 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovn-acl-logging" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173272 4837 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.173468 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173479 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.173612 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" containerName="ovnkube-controller" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.178545 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218840 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218944 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.218984 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219005 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219139 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219081 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219213 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket" (OuterVolumeSpecName: "log-socket") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219207 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219281 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219287 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219325 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219374 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: 
I0313 11:59:51.219434 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219462 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219486 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219609 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219630 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") pod \"43df29f7-1351-41f5-bfca-17f804837cb4\" (UID: \"43df29f7-1351-41f5-bfca-17f804837cb4\") " Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219880 4837 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219905 4837 reconciler_common.go:293] "Volume detached for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219920 4837 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219932 4837 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219943 4837 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.219976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log" (OuterVolumeSpecName: "node-log") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220001 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220025 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220073 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220393 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220556 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220834 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220849 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.220702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash" (OuterVolumeSpecName: "host-slash") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.221512 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.226458 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.226741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll" (OuterVolumeSpecName: "kube-api-access-85hll") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "kube-api-access-85hll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.235720 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "43df29f7-1351-41f5-bfca-17f804837cb4" (UID: "43df29f7-1351-41f5-bfca-17f804837cb4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321449 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321516 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321541 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321565 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321586 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321718 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321781 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.321834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322005 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322082 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc 
kubenswrapper[4837]: I0313 11:59:51.322194 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322447 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322486 4837 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322498 4837 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322506 4837 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322520 4837 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322530 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/43df29f7-1351-41f5-bfca-17f804837cb4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322540 4837 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322549 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322559 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322568 4837 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322578 4837 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322587 4837 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322595 4837 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43df29f7-1351-41f5-bfca-17f804837cb4-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322604 4837 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43df29f7-1351-41f5-bfca-17f804837cb4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.322613 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hll\" (UniqueName: \"kubernetes.io/projected/43df29f7-1351-41f5-bfca-17f804837cb4-kube-api-access-85hll\") on node \"crc\" DevicePath \"\"" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.338500 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339030 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/1.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339078 4837 generic.go:334] "Generic (PLEG): container finished" podID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" exitCode=2 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerDied","Data":"1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c"} Mar 13 
11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339195 4837 scope.go:117] "RemoveContainer" containerID="19b8a72f10c691a74098997e9d2383adf1aeb1811ad22dc8a74b5a47945d1e3e" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.339710 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.339925 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.341546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovnkube-controller/3.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.344949 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-acl-logging/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.345455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4zzrs_43df29f7-1351-41f5-bfca-17f804837cb4/ovn-controller/0.log" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346019 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346184 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346198 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346208 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346104 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346166 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346218 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346443 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" exitCode=0 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346462 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" exitCode=143 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346477 4837 generic.go:334] "Generic (PLEG): container finished" podID="43df29f7-1351-41f5-bfca-17f804837cb4" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" exitCode=143 Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346524 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346540 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346570 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346581 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346592 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346602 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346613 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346620 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346627 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346654 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346684 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346700 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346709 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346717 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346724 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346732 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346739 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346746 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346754 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346762 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346769 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346791 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346798 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346805 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346812 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346819 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346825 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346835 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346842 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346850 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346857 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4zzrs" event={"ID":"43df29f7-1351-41f5-bfca-17f804837cb4","Type":"ContainerDied","Data":"17148b76b47a8d352ae2adca8c21dbaa4b189a84d57c2f7678c2d83f59bfc901"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346880 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346888 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346896 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346905 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346912 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346920 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346927 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346933 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346941 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.346947 4837 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.382804 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"] Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.388555 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4zzrs"] Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424613 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424623 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-etc-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424692 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424698 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-log-socket\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424735 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-slash\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424746 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424766 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-node-log\") pod 
\"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424828 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424851 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424948 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-systemd-units\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.424730 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425272 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-bin\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425299 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-var-lib-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425348 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425533 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-ovn\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc 
kubenswrapper[4837]: I0313 11:59:51.425710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-openvswitch\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-kubelet\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425753 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-run-systemd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425788 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425825 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-cni-netd\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425845 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b564b0f-ab5a-454b-8588-a645fdec0058-host-run-netns\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425867 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-config\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.425888 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-ovnkube-script-lib\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.426514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b564b0f-ab5a-454b-8588-a645fdec0058-env-overrides\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.428539 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7b564b0f-ab5a-454b-8588-a645fdec0058-ovn-node-metrics-cert\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.444146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj4xf\" (UniqueName: \"kubernetes.io/projected/7b564b0f-ab5a-454b-8588-a645fdec0058-kube-api-access-dj4xf\") pod \"ovnkube-node-6bbfz\" (UID: \"7b564b0f-ab5a-454b-8588-a645fdec0058\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.451792 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.481972 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.499108 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.512163 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.547400 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.571781 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.593550 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.614258 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.636952 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.655348 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.678493 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.705720 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.706821 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.706885 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status 
\"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.706912 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.707321 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.707390 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.707431 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.707970 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708001 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708022 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.708316 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708340 4837 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708356 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.708710 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708819 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.708842 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709118 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709142 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709158 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709467 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" 
containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709496 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709517 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.709830 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709861 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.709884 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.710146 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710173 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710193 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: E0313 11:59:51.710408 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710430 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.710444 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.711832 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.711870 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712146 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712175 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712528 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712565 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712921 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.712945 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.713824 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.713854 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714163 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714188 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714487 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.714501 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715368 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 
13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715388 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715815 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.715841 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716045 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716060 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716267 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716284 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716485 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716502 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716730 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status 
\"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.716750 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718365 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718393 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718663 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.718692 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719074 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719099 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719382 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.719407 4837 scope.go:117] "RemoveContainer" 
containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.720683 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.720718 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721061 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721105 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721417 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721453 4837 scope.go:117] "RemoveContainer" containerID="f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721729 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282"} err="failed to get container status \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": rpc error: code = NotFound desc = could not find container \"f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282\": container with ID starting with f372f76d94f347bed3cba6f20ca7f85f6137b2444cd34f244ac90b2d4ac58282 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721752 4837 scope.go:117] "RemoveContainer" containerID="01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.721999 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c"} err="failed to get container status \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": rpc error: code = NotFound desc = could not find 
container \"01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c\": container with ID starting with 01e70762247df5ba4a9c62669441b805f6d383ff6d85ec89de9a49acaf23669c not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722029 4837 scope.go:117] "RemoveContainer" containerID="60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722255 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a"} err="failed to get container status \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": rpc error: code = NotFound desc = could not find container \"60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a\": container with ID starting with 60f0427d0696a93ac350078e4555381a8ac08be223580c656703e21d1b7dbc9a not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722284 4837 scope.go:117] "RemoveContainer" containerID="80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722577 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39"} err="failed to get container status \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": rpc error: code = NotFound desc = could not find container \"80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39\": container with ID starting with 80132867fec058c31f7bd95300824315cb52c36ed3b567d2e85165185da43e39 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722652 4837 scope.go:117] "RemoveContainer" containerID="7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7"} err="failed to get container status \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": rpc error: code = NotFound desc = could not find container \"7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7\": container with ID starting with 7659c5e02ee15dbf0bf356aeaa0ff0b3020f60ca68cca412792723f94cca13b7 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.722945 4837 scope.go:117] "RemoveContainer" containerID="954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723145 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793"} err="failed to get container status \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": rpc error: code = NotFound desc = could not find container \"954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793\": container with ID starting with 954136e258aa821f886ba7dd6ed22c9ad3585341d07f6671f8b3ef8a6e975793 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723191 4837 scope.go:117] "RemoveContainer" containerID="bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723434 4837 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2"} err="failed to get container status \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": rpc error: code = NotFound desc = could not find container \"bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2\": container with ID starting with bd1b2524562cda51f2cff2438d46853d3a7c6536eae7b29445bc6183b6ae92e2 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723456 4837 scope.go:117] "RemoveContainer" containerID="c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723609 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d"} err="failed to get container status \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": rpc error: code = NotFound desc = could not find container \"c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d\": container with ID starting with c4ac232adc54600316e55f9c20ab2991506303b92e412a4ab7606b9ba532822d not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723627 4837 scope.go:117] "RemoveContainer" containerID="b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723840 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1"} err="failed to get container status \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": rpc error: code = NotFound desc = could not find container \"b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1\": container with ID starting with b4c6b0a6a60accbadf0f4465bfab99996880cdf1acf6a61df18ac43fe61630d1 not found: ID does not exist" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.723875 4837 scope.go:117] "RemoveContainer" containerID="4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60" Mar 13 11:59:51 crc kubenswrapper[4837]: I0313 11:59:51.724066 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60"} err="failed to get container status \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": rpc error: code = NotFound desc = could not find container \"4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60\": container with ID starting with 4888278b4c8a17dd984be26b4d94bd2949f9bf2dac4fd43116bb509fb36b5e60 not found: ID does not exist" Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.358322 4837 generic.go:334] "Generic (PLEG): container finished" podID="7b564b0f-ab5a-454b-8588-a645fdec0058" containerID="bb5ee5517ad5a43762ab2251e410f5318f82ec2a81d734f67f2f3182b5ffbaac" exitCode=0 Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.358424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerDied","Data":"bb5ee5517ad5a43762ab2251e410f5318f82ec2a81d734f67f2f3182b5ffbaac"} Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.360019 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6afb7fa965f127a6881f0f6df8d1b6b9e17a876a7592677e4d80c493ed85fc49"} Mar 13 11:59:52 crc kubenswrapper[4837]: I0313 11:59:52.364468 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log" Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.056088 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43df29f7-1351-41f5-bfca-17f804837cb4" path="/var/lib/kubelet/pods/43df29f7-1351-41f5-bfca-17f804837cb4/volumes" Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"92962e881a495a2e0ac4153f3505318b6a007e7e8b0cc140b0f3ba578e6d7723"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373329 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"866307818a57731c0c4ee24805a1470f96af533343d809502d6e4e2525011118"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373343 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"1ffd37843b0efe2597117fe6f66f589d3258198d3a2d361ff5fc4bbc1d55a53e"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373354 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6d035b72086b8923cb3e9c835c1d7ab969f81dac5d95be31c196c38feb879837"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"19e6a3f68d9f047b7f0c6cf4e1627a27f8a817a282970d9795805df7cd12052f"} Mar 13 11:59:53 crc kubenswrapper[4837]: I0313 11:59:53.373370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"896ceaa574226ecb36e0da9d7fba5e511a2dd2595dbf5da03f24f83259009ea4"} Mar 13 11:59:55 crc kubenswrapper[4837]: I0313 11:59:55.390332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"6dde7ba59f89410c93c1507806ee885ec11a606cc8922ccd36d5c63c161a3f8a"} Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.416606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" event={"ID":"7b564b0f-ab5a-454b-8588-a645fdec0058","Type":"ContainerStarted","Data":"03d8f6aca9a68604183217b6f548ae221326d4cdb9fd9cba920c5ad2cf17b2a4"} Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.417182 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.417204 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.446368 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:58 crc kubenswrapper[4837]: I0313 11:59:58.460927 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" podStartSLOduration=7.460909185 podStartE2EDuration="7.460909185s" podCreationTimestamp="2026-03-13 11:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 11:59:58.457441174 +0000 UTC m=+714.095707937" watchObservedRunningTime="2026-03-13 11:59:58.460909185 +0000 UTC m=+714.099175958" Mar 13 11:59:59 crc kubenswrapper[4837]: I0313 11:59:59.422596 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 11:59:59 crc kubenswrapper[4837]: I0313 11:59:59.448350 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.133875 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.134622 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137703 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137786 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.137897 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.145424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.237127 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.237978 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.241257 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.243936 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.244131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.244469 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345438 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345511 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.345556 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.365424 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"auto-csr-approver-29556720-wqrqr\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.446497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod 
\"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.447982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.452274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.458900 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.464209 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"collect-profiles-29556720-slvpc\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487772 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487875 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487912 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.487988 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(3b42cbee7461239d7eda46c0d8702483566a37f1f91506d382b292950e70f220): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" Mar 13 12:00:00 crc kubenswrapper[4837]: I0313 12:00:00.552697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576759 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576834 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576863 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:00 crc kubenswrapper[4837]: E0313 12:00:00.576927 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(0dd387fe97dac6ebba71b653e34d9c2de468b5f529efcc97f6ac0d3f47cdbb41): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.432538 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.432597 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.433136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: I0313 12:00:01.433256 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461035 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461097 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461117 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.461163 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(4eb7e214cd1b7a56654cd4d7e82c2556d3e79d5da56a054f1b205ee2a1f2bc6a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466763 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466811 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466834 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:01 crc kubenswrapper[4837]: E0313 12:00:01.466888 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(57f223134f24ddc5890336e84be957881b1f2e52ef9d200b0157237c48d1f945): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" Mar 13 12:00:04 crc kubenswrapper[4837]: I0313 12:00:04.047821 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" Mar 13 12:00:04 crc kubenswrapper[4837]: E0313 12:00:04.048313 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qg957_openshift-multus(cbb3f4c6-a6c5-4059-8beb-04179d70aff5)\"" pod="openshift-multus/multus-qg957" podUID="cbb3f4c6-a6c5-4059-8beb-04179d70aff5" Mar 13 12:00:14 crc kubenswrapper[4837]: I0313 12:00:14.047779 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: I0313 12:00:14.049244 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.097827 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098164 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098183 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:14 crc kubenswrapper[4837]: E0313 12:00:14.098226 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager(a6d18151-32fe-4457-814f-33c3ed53dab8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29556720-slvpc_openshift-operator-lifecycle-manager_a6d18151-32fe-4457-814f-33c3ed53dab8_0(3ccb157306fc69ff6691b77cd3a42f47ded828fe96e3883eab7134165784595a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" Mar 13 12:00:16 crc kubenswrapper[4837]: I0313 12:00:16.047686 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:16 crc kubenswrapper[4837]: I0313 12:00:16.049120 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.083969 4837 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084103 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084141 4837 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:16 crc kubenswrapper[4837]: E0313 12:00:16.084211 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29556720-wqrqr_openshift-infra(1335d65b-c0fb-4085-86eb-d948f797ef68)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29556720-wqrqr_openshift-infra_1335d65b-c0fb-4085-86eb-d948f797ef68_0(b313d1c76170bd48b4bbcc367ae0fed8f9e48566cf1dfcb70b7a29f437fb08e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.048053 4837 scope.go:117] "RemoveContainer" containerID="1effae1c86d3c4f5369295262f269b1dad692c561321e1c868d2b4fe7f736d7c" Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.518226 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qg957_cbb3f4c6-a6c5-4059-8beb-04179d70aff5/kube-multus/2.log" Mar 13 12:00:17 crc kubenswrapper[4837]: I0313 12:00:17.519878 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qg957" event={"ID":"cbb3f4c6-a6c5-4059-8beb-04179d70aff5","Type":"ContainerStarted","Data":"af2a6c239ad0d8b155fd9808f142bbb42034d2d57141d3abc86f61d28daa588e"} Mar 13 12:00:21 crc kubenswrapper[4837]: I0313 12:00:21.529208 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bbfz" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.556591 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"] Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.558107 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.560923 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.571747 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"] Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575539 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.575610 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676684 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.676734 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.677283 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.677329 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.696241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:26 crc kubenswrapper[4837]: I0313 12:00:26.874901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.047991 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.048746 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.131247 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h"] Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.268012 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:00:27 crc kubenswrapper[4837]: W0313 12:00:27.271345 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d18151_32fe_4457_814f_33c3ed53dab8.slice/crio-e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e WatchSource:0}: Error finding container e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e: Status 404 returned error can't find the container with id e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.574951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerStarted","Data":"fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a"} Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.575247 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerStarted","Data":"197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91"} Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.577629 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerStarted","Data":"2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c"} Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.577696 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerStarted","Data":"e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e"} Mar 13 12:00:27 crc kubenswrapper[4837]: I0313 12:00:27.611743 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" podStartSLOduration=27.611701705 podStartE2EDuration="27.611701705s" podCreationTimestamp="2026-03-13 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:00:27.607397188 +0000 UTC m=+743.245663971" watchObservedRunningTime="2026-03-13 12:00:27.611701705 +0000 UTC m=+743.249968488" Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.047718 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.048374 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.228842 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:00:28 crc kubenswrapper[4837]: W0313 12:00:28.235937 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1335d65b_c0fb_4085_86eb_d948f797ef68.slice/crio-dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6 WatchSource:0}: Error finding container dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6: Status 404 returned error can't find the container with id dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6 Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.584793 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a" exitCode=0 Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.584873 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"fd40f2ee3c46e2914f7c79e21fff2402975f3e74643fb0fdb37eea494930a16a"} Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.586330 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerStarted","Data":"dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6"} Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.589738 4837 generic.go:334] "Generic (PLEG): container finished" podID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerID="2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c" exitCode=0 Mar 13 12:00:28 crc kubenswrapper[4837]: I0313 12:00:28.589789 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerDied","Data":"2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c"} Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.596918 4837 generic.go:334] "Generic (PLEG): container finished" podID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerID="1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51" exitCode=0 Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.597822 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerDied","Data":"1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51"} Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.810427 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930078 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930200 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.930265 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") pod \"a6d18151-32fe-4457-814f-33c3ed53dab8\" (UID: \"a6d18151-32fe-4457-814f-33c3ed53dab8\") " Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.931158 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.935970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp" (OuterVolumeSpecName: "kube-api-access-zfjfp") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "kube-api-access-zfjfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:00:29 crc kubenswrapper[4837]: I0313 12:00:29.936078 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6d18151-32fe-4457-814f-33c3ed53dab8" (UID: "a6d18151-32fe-4457-814f-33c3ed53dab8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfjfp\" (UniqueName: \"kubernetes.io/projected/a6d18151-32fe-4457-814f-33c3ed53dab8-kube-api-access-zfjfp\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032296 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6d18151-32fe-4457-814f-33c3ed53dab8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.032305 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6d18151-32fe-4457-814f-33c3ed53dab8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.608506 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="98d1e6ef469c7b5964bf18569c9c28706c631a5559be2f9d43b97d26249a2d7c" exitCode=0 Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.608573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"98d1e6ef469c7b5964bf18569c9c28706c631a5559be2f9d43b97d26249a2d7c"} Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.612589 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.613853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc" event={"ID":"a6d18151-32fe-4457-814f-33c3ed53dab8","Type":"ContainerDied","Data":"e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e"} Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.613913 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11c2e11328d466588f9e5269543514fe5ebff4bbda4de6454f2bd5db61ff79e" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.835124 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.944574 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") pod \"1335d65b-c0fb-4085-86eb-d948f797ef68\" (UID: \"1335d65b-c0fb-4085-86eb-d948f797ef68\") " Mar 13 12:00:30 crc kubenswrapper[4837]: I0313 12:00:30.949477 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4" (OuterVolumeSpecName: "kube-api-access-fflv4") pod "1335d65b-c0fb-4085-86eb-d948f797ef68" (UID: "1335d65b-c0fb-4085-86eb-d948f797ef68"). InnerVolumeSpecName "kube-api-access-fflv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.045763 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflv4\" (UniqueName: \"kubernetes.io/projected/1335d65b-c0fb-4085-86eb-d948f797ef68-kube-api-access-fflv4\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.622341 4837 generic.go:334] "Generic (PLEG): container finished" podID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerID="518a2132bb0b0d605b34515994dcb95f1f8ab534bc9ae285b7a96e1e9d3840e5" exitCode=0 Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.622410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"518a2132bb0b0d605b34515994dcb95f1f8ab534bc9ae285b7a96e1e9d3840e5"} Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.624992 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" event={"ID":"1335d65b-c0fb-4085-86eb-d948f797ef68","Type":"ContainerDied","Data":"dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6"} Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.625026 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc09c091aad8a901e4525c66559bbe36864c7168b149b94e87403d64c4f1e9a6" Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.625074 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556720-wqrqr" Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.909567 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 12:00:31 crc kubenswrapper[4837]: I0313 12:00:31.913602 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556714-jzzgx"] Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.867505 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968698 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.968828 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") pod \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\" (UID: \"c49e70e5-a4f6-4782-aa38-2faeb20ec38a\") " Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.969925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle" (OuterVolumeSpecName: "bundle") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.976523 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc" (OuterVolumeSpecName: "kube-api-access-hh7sc") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "kube-api-access-hh7sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:00:32 crc kubenswrapper[4837]: I0313 12:00:32.993368 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util" (OuterVolumeSpecName: "util") pod "c49e70e5-a4f6-4782-aa38-2faeb20ec38a" (UID: "c49e70e5-a4f6-4782-aa38-2faeb20ec38a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.058845 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7a269a-3d94-4758-922d-9886312f2a25" path="/var/lib/kubelet/pods/2b7a269a-3d94-4758-922d-9886312f2a25/volumes" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070522 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070556 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.070566 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh7sc\" (UniqueName: \"kubernetes.io/projected/c49e70e5-a4f6-4782-aa38-2faeb20ec38a-kube-api-access-hh7sc\") on node \"crc\" DevicePath \"\"" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.640900 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" event={"ID":"c49e70e5-a4f6-4782-aa38-2faeb20ec38a","Type":"ContainerDied","Data":"197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91"} Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.640966 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="197d1987346ba3a53d3f2e66c5ace726d54ba6e0c9cc65dbb51ca5434993db91" Mar 13 12:00:33 crc kubenswrapper[4837]: I0313 12:00:33.641064 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.112877 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"] Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113586 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="util" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="util" Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113613 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113618 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract" Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113631 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="pull" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113654 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="pull" Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113669 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113674 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc" Mar 13 12:00:38 crc kubenswrapper[4837]: E0313 12:00:38.113686 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113691 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113776 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" containerName="collect-profiles" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113788 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49e70e5-a4f6-4782-aa38-2faeb20ec38a" containerName="extract" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.113799 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" containerName="oc" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.114140 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.115997 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.116486 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cfzms" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.116579 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.122424 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"] Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.237301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.338835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.364305 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqd6\" (UniqueName: \"kubernetes.io/projected/ef7096b9-861a-4889-9318-535c35151777-kube-api-access-9rqd6\") pod \"nmstate-operator-796d4cfff4-zf78q\" (UID: \"ef7096b9-861a-4889-9318-535c35151777\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.429964 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.629573 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zf78q"] Mar 13 12:00:38 crc kubenswrapper[4837]: W0313 12:00:38.638411 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7096b9_861a_4889_9318_535c35151777.slice/crio-7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f WatchSource:0}: Error finding container 7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f: Status 404 returned error can't find the container with id 7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f Mar 13 12:00:38 crc kubenswrapper[4837]: I0313 12:00:38.672703 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" event={"ID":"ef7096b9-861a-4889-9318-535c35151777","Type":"ContainerStarted","Data":"7f6edabb0070547e15deb962b244bbed98a0578ef1a7447be43c187885888f8f"} Mar 13 12:00:41 crc kubenswrapper[4837]: I0313 12:00:41.690540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" event={"ID":"ef7096b9-861a-4889-9318-535c35151777","Type":"ContainerStarted","Data":"b547724cd9bc88b4ef0a860c645ac542cb68b7437143e52f8ae4ff67ee817dc2"} Mar 13 12:00:41 crc kubenswrapper[4837]: I0313 12:00:41.707369 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zf78q" podStartSLOduration=1.197140837 podStartE2EDuration="3.707348569s" podCreationTimestamp="2026-03-13 12:00:38 +0000 UTC" firstStartedPulling="2026-03-13 12:00:38.640914777 +0000 UTC m=+754.279181550" lastFinishedPulling="2026-03-13 12:00:41.151122519 +0000 UTC m=+756.789389282" observedRunningTime="2026-03-13 12:00:41.705400206 +0000 UTC m=+757.343666989" watchObservedRunningTime="2026-03-13 12:00:41.707348569 +0000 UTC m=+757.345615332" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.876371 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"] Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.877984 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.880231 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m78qn" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.888520 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"] Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.893184 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"] Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.894037 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.896471 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.911250 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vqqqz"] Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.912086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.925794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"] Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:46 crc kubenswrapper[4837]: I0313 12:00:46.944283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.010182 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.010841 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.012684 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.013827 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.014236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xpb8c" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.019597 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045555 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045587 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045791 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod 
\"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045922 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78lxd\" (UniqueName: \"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.045953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.046005 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: E0313 12:00:47.046152 4837 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 13 12:00:47 crc kubenswrapper[4837]: E0313 12:00:47.046227 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair podName:0b06c77a-f41d-41a6-b115-f12cc5109c0c nodeName:}" failed. No retries permitted until 2026-03-13 12:00:47.546208177 +0000 UTC m=+763.184474940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair") pod "nmstate-webhook-5f558f5558-6cx5h" (UID: "0b06c77a-f41d-41a6-b115-f12cc5109c0c") : secret "openshift-nmstate-webhook" not found Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.071130 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h2h\" (UniqueName: \"kubernetes.io/projected/5d1f2d02-86ab-4679-a4e4-530ad37e4302-kube-api-access-m8h2h\") pod \"nmstate-metrics-9b8c8685d-8xzdk\" (UID: \"5d1f2d02-86ab-4679-a4e4-530ad37e4302\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.078135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4985t\" (UniqueName: \"kubernetes.io/projected/0b06c77a-f41d-41a6-b115-f12cc5109c0c-kube-api-access-4985t\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147242 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147361 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78lxd\" (UniqueName: \"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147407 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147422 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: 
I0313 12:00:47.147446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.147682 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-dbus-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.148207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-ovs-socket\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.148451 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ebe31727-805d-472e-89d3-e99b11435be1-nmstate-lock\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.149588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/00b31b3f-b520-493a-ad26-679e09376e81-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.154769 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b31b3f-b520-493a-ad26-679e09376e81-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.172544 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78lxd\" (UniqueName: \"kubernetes.io/projected/ebe31727-805d-472e-89d3-e99b11435be1-kube-api-access-78lxd\") pod \"nmstate-handler-vqqqz\" (UID: \"ebe31727-805d-472e-89d3-e99b11435be1\") " pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.176569 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2mf\" (UniqueName: \"kubernetes.io/projected/00b31b3f-b520-493a-ad26-679e09376e81-kube-api-access-cm2mf\") pod \"nmstate-console-plugin-86f58fcf4-fpxmr\" (UID: \"00b31b3f-b520-493a-ad26-679e09376e81\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.193162 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.213230 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.214074 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.226216 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.232195 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.327301 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350542 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpd9s\" (UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350749 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.350875 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc 
kubenswrapper[4837]: I0313 12:00:47.452393 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452489 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpd9s\" (UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452555 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.452587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.453712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-oauth-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.453778 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-console-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: 
I0313 12:00:47.453863 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-service-ca\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.454384 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a7ec137-20d8-418a-a85e-70034882f17b-trusted-ca-bundle\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.457397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-oauth-config\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.458534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7ec137-20d8-418a-a85e-70034882f17b-console-serving-cert\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.472568 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpd9s\" (UniqueName: \"kubernetes.io/projected/5a7ec137-20d8-418a-a85e-70034882f17b-kube-api-access-vpd9s\") pod \"console-854454756c-m4vqj\" (UID: \"5a7ec137-20d8-418a-a85e-70034882f17b\") " pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.529801 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.553975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.557449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b06c77a-f41d-41a6-b115-f12cc5109c0c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6cx5h\" (UID: \"0b06c77a-f41d-41a6-b115-f12cc5109c0c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.605714 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk"] Mar 13 12:00:47 crc kubenswrapper[4837]: W0313 12:00:47.624945 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1f2d02_86ab_4679_a4e4_530ad37e4302.slice/crio-a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258 WatchSource:0}: Error finding container a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258: Status 404 returned error can't find the container with id a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258 Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.699521 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854454756c-m4vqj"] Mar 13 12:00:47 crc kubenswrapper[4837]: W0313 12:00:47.703350 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7ec137_20d8_418a_a85e_70034882f17b.slice/crio-9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9 WatchSource:0}: Error finding container 9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9: Status 404 returned error can't find the container with id 9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9 Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.728314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr"] Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.733418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854454756c-m4vqj" event={"ID":"5a7ec137-20d8-418a-a85e-70034882f17b","Type":"ContainerStarted","Data":"9ca0676c9684bc9fdaf0b00e8c14402e7bc7de2ba404993ecda5a6ee276442b9"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.734478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqqqz" event={"ID":"ebe31727-805d-472e-89d3-e99b11435be1","Type":"ContainerStarted","Data":"60fa02310af7adab0a694c629c6c25df0f989119c50ec725c46f2c14e712994f"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.735772 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"a45c2b0e8e0bbe0b8df88fc14927993ad8f2a61aeb5953dd1784c7e56716e258"} Mar 13 12:00:47 crc kubenswrapper[4837]: I0313 12:00:47.807094 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.006836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h"] Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.742655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" event={"ID":"00b31b3f-b520-493a-ad26-679e09376e81","Type":"ContainerStarted","Data":"dcbed1a67c9dc1d66a41de49083bbe771dddbc7f5a38bbf0ab421de04ecfe33d"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.743844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" event={"ID":"0b06c77a-f41d-41a6-b115-f12cc5109c0c","Type":"ContainerStarted","Data":"ca50d1b1a8f24579c70b00e4de89f7644e846fd9eb1c8e85b4fac31c249d87b8"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.745570 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854454756c-m4vqj" event={"ID":"5a7ec137-20d8-418a-a85e-70034882f17b","Type":"ContainerStarted","Data":"a817a4b9822e441997d9c09c8a2b1479db54f18e4babb28b7cd340fe18b1bf1a"} Mar 13 12:00:48 crc kubenswrapper[4837]: I0313 12:00:48.771612 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-854454756c-m4vqj" podStartSLOduration=1.771594417 podStartE2EDuration="1.771594417s" podCreationTimestamp="2026-03-13 12:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:00:48.768818059 +0000 UTC m=+764.407084852" watchObservedRunningTime="2026-03-13 12:00:48.771594417 +0000 UTC m=+764.409861180" Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.766024 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" event={"ID":"0b06c77a-f41d-41a6-b115-f12cc5109c0c","Type":"ContainerStarted","Data":"071ac06be957e77530585aa750a43b9b20e43da820b08a24ce544438007919af"} Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.766698 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.768088 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqqqz" event={"ID":"ebe31727-805d-472e-89d3-e99b11435be1","Type":"ContainerStarted","Data":"3d4ed2e89ee1a42b4883155ccb45c1352121d16356c3190690720e7ed39eab4d"} Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.768200 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.769877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" event={"ID":"00b31b3f-b520-493a-ad26-679e09376e81","Type":"ContainerStarted","Data":"98059a45b58908574f3200c3b0d93a3aadd72da2cafdf805fa0b95c154ed526f"} Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.771123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"c3ffde7165f32e645801163b2a292038815fc9c3f316f07353aff84a66ebc113"} Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.786077 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" podStartSLOduration=3.182402109 podStartE2EDuration="5.786055436s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:48.014648502 +0000 UTC m=+763.652915265" lastFinishedPulling="2026-03-13 12:00:50.618301829 +0000 UTC m=+766.256568592" observedRunningTime="2026-03-13 12:00:51.784436464 +0000 UTC m=+767.422703227" watchObservedRunningTime="2026-03-13 12:00:51.786055436 +0000 UTC m=+767.424322199" Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.806156 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vqqqz" podStartSLOduration=2.482703446 podStartE2EDuration="5.806141314s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.273356865 +0000 UTC m=+762.911623628" lastFinishedPulling="2026-03-13 12:00:50.596794713 +0000 UTC m=+766.235061496" observedRunningTime="2026-03-13 12:00:51.804386189 +0000 UTC m=+767.442652952" watchObservedRunningTime="2026-03-13 12:00:51.806141314 +0000 UTC m=+767.444408077" Mar 13 12:00:51 crc kubenswrapper[4837]: I0313 12:00:51.825554 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-fpxmr" podStartSLOduration=2.971000153 podStartE2EDuration="5.825537392s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.73961049 +0000 UTC m=+763.377877253" lastFinishedPulling="2026-03-13 12:00:50.594147689 +0000 UTC m=+766.232414492" observedRunningTime="2026-03-13 12:00:51.821484553 +0000 UTC m=+767.459751326" watchObservedRunningTime="2026-03-13 12:00:51.825537392 +0000 UTC m=+767.463804155" Mar 13 12:00:53 crc kubenswrapper[4837]: I0313 12:00:53.783631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" event={"ID":"5d1f2d02-86ab-4679-a4e4-530ad37e4302","Type":"ContainerStarted","Data":"77dff13be078c7601290f9b90ec50db8b6bef2a40f9e3f30dbb760d47ce80e19"} Mar 13 12:00:53 crc kubenswrapper[4837]: I0313 12:00:53.805408 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8xzdk" podStartSLOduration=2.266604648 podStartE2EDuration="7.805390129s" podCreationTimestamp="2026-03-13 12:00:46 +0000 UTC" firstStartedPulling="2026-03-13 12:00:47.626591154 +0000 UTC m=+763.264857917" lastFinishedPulling="2026-03-13 12:00:53.165376635 +0000 UTC m=+768.803643398" observedRunningTime="2026-03-13 12:00:53.802676613 +0000 UTC m=+769.440943376" watchObservedRunningTime="2026-03-13 12:00:53.805390129 +0000 UTC m=+769.443656892" Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.259959 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vqqqz" Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.530482 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.530555 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.536551 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:57 crc 
kubenswrapper[4837]: I0313 12:00:57.820092 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-854454756c-m4vqj" Mar 13 12:00:57 crc kubenswrapper[4837]: I0313 12:00:57.873942 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 12:01:05 crc kubenswrapper[4837]: I0313 12:01:05.484417 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:01:05 crc kubenswrapper[4837]: I0313 12:01:05.484969 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:01:07 crc kubenswrapper[4837]: I0313 12:01:07.816413 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6cx5h" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.539574 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"] Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.541740 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.544174 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.553063 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"] Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.702997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.703087 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.703318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" 
Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.804101 4837 scope.go:117] "RemoveContainer" containerID="35377d4210b529c8401b806fa107dba5beb6002cbc3a3ce3ea9ad22bd10d0960" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805225 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.805325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.806251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.806740 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.833228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:21 crc kubenswrapper[4837]: I0313 12:01:21.871801 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.071228 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf"] Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.112370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerStarted","Data":"39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd"} Mar 13 12:01:22 crc kubenswrapper[4837]: I0313 12:01:22.936516 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q2qpt" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" containerID="cri-o://c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112" gracePeriod=15 Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.121987 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.122263 4837 generic.go:334] "Generic (PLEG): container finished" podID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerID="c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112" exitCode=2 Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.122363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerDied","Data":"c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112"} Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.126911 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="d70d2a6694e0790bf492b578e3f25e7018641768f6c2060e8273ce8eba4c9dad" exitCode=0 Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.126942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"d70d2a6694e0790bf492b578e3f25e7018641768f6c2060e8273ce8eba4c9dad"} Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.283127 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.283227 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.425922 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426268 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426357 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.426475 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") pod \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\" (UID: \"c83842ec-9933-4f84-bb4a-c84ca61a28e1\") " Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427314 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config" (OuterVolumeSpecName: "console-config") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427626 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427720 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca" (OuterVolumeSpecName: "service-ca") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.427800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.433576 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd" (OuterVolumeSpecName: "kube-api-access-jpjrd") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "kube-api-access-jpjrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.438854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.439147 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c83842ec-9933-4f84-bb4a-c84ca61a28e1" (UID: "c83842ec-9933-4f84-bb4a-c84ca61a28e1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528486 4837 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528537 4837 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528559 4837 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528576 4837 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528595 4837 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83842ec-9933-4f84-bb4a-c84ca61a28e1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528612 4837 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c83842ec-9933-4f84-bb4a-c84ca61a28e1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:23 crc kubenswrapper[4837]: I0313 12:01:23.528629 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpjrd\" (UniqueName: \"kubernetes.io/projected/c83842ec-9933-4f84-bb4a-c84ca61a28e1-kube-api-access-jpjrd\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.133931 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q2qpt_c83842ec-9933-4f84-bb4a-c84ca61a28e1/console/0.log" Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134049 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q2qpt" Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134126 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q2qpt" event={"ID":"c83842ec-9933-4f84-bb4a-c84ca61a28e1","Type":"ContainerDied","Data":"6d6886f8a08a9d6498bf2731a6faf601bf8b43c566b4a0dbe066c5557e5e15e0"} Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.134196 4837 scope.go:117] "RemoveContainer" containerID="c3e3e9b2ed47e2f7480af78d679ab1d816ea01c193c35244aa52793e0f02f112" Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.168352 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 12:01:24 crc kubenswrapper[4837]: I0313 12:01:24.173705 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q2qpt"] Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.055913 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" path="/var/lib/kubelet/pods/c83842ec-9933-4f84-bb4a-c84ca61a28e1/volumes" Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.143418 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="8bd41bde4a756b05989d08b252d536c3fed0fbc6582602087dd30edfc3ffcfcd" exitCode=0 Mar 13 12:01:25 crc kubenswrapper[4837]: I0313 12:01:25.143471 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"8bd41bde4a756b05989d08b252d536c3fed0fbc6582602087dd30edfc3ffcfcd"} Mar 13 12:01:26 crc kubenswrapper[4837]: I0313 12:01:26.154883 4837 generic.go:334] "Generic (PLEG): container finished" podID="b1863878-b849-4485-9e78-35c9f9856697" containerID="b29e084b12ffcf45b76e91c7e0adf96d7385273533a2a302f84e65b130630738" exitCode=0 Mar 13 12:01:26 crc kubenswrapper[4837]: I0313 12:01:26.154940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"b29e084b12ffcf45b76e91c7e0adf96d7385273533a2a302f84e65b130630738"} Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.378786 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577468 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.577496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") pod \"b1863878-b849-4485-9e78-35c9f9856697\" (UID: \"b1863878-b849-4485-9e78-35c9f9856697\") " Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.578846 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle" (OuterVolumeSpecName: "bundle") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.583130 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz" (OuterVolumeSpecName: "kube-api-access-kdwzz") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "kube-api-access-kdwzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.678940 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.678981 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwzz\" (UniqueName: \"kubernetes.io/projected/b1863878-b849-4485-9e78-35c9f9856697-kube-api-access-kdwzz\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.727059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util" (OuterVolumeSpecName: "util") pod "b1863878-b849-4485-9e78-35c9f9856697" (UID: "b1863878-b849-4485-9e78-35c9f9856697"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:01:27 crc kubenswrapper[4837]: I0313 12:01:27.780258 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1863878-b849-4485-9e78-35c9f9856697-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166267 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" event={"ID":"b1863878-b849-4485-9e78-35c9f9856697","Type":"ContainerDied","Data":"39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd"} Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166302 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ce74619aec31a4e35d3a5468f0f1734a404c9cf2b1ede413f01a38b7ff24cd" Mar 13 12:01:28 crc kubenswrapper[4837]: I0313 12:01:28.166308 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf" Mar 13 12:01:35 crc kubenswrapper[4837]: I0313 12:01:35.484100 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:01:35 crc kubenswrapper[4837]: I0313 12:01:35.484444 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.293989 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"] Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294855 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="pull" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="pull" Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294919 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294925 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract" Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294935 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294941 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" Mar 13 12:01:42 crc kubenswrapper[4837]: E0313 12:01:42.294954 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="util" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.294959 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1863878-b849-4485-9e78-35c9f9856697" 
containerName="util" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295070 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83842ec-9933-4f84-bb4a-c84ca61a28e1" containerName="console" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295083 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1863878-b849-4485-9e78-35c9f9856697" containerName="extract" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.295504 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.297202 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.297546 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298191 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298548 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.298748 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-c94w4" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.308581 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"] Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.366711 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.366859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.367014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 
12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467744 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.467777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.474240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-apiservice-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.476756 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41898fd8-d078-444c-bb55-33f4fb6f3dcc-webhook-cert\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.498490 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8fd\" (UniqueName: \"kubernetes.io/projected/41898fd8-d078-444c-bb55-33f4fb6f3dcc-kube-api-access-cq8fd\") pod \"metallb-operator-controller-manager-dcfbdf95f-7x96d\" (UID: \"41898fd8-d078-444c-bb55-33f4fb6f3dcc\") " pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.546295 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"] Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.547169 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.548827 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.549185 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7tcx9" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.550095 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.568554 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"] Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.613087 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.669411 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.669480 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.669613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.770420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.788884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kfq\" (UniqueName: \"kubernetes.io/projected/eabfad13-4fe4-495d-8b6a-2da56ef3b826-kube-api-access-79kfq\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.789863 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-webhook-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " 
pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.790757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eabfad13-4fe4-495d-8b6a-2da56ef3b826-apiservice-cert\") pod \"metallb-operator-webhook-server-59b847b88-lrvzm\" (UID: \"eabfad13-4fe4-495d-8b6a-2da56ef3b826\") " pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.834530 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d"] Mar 13 12:01:42 crc kubenswrapper[4837]: W0313 12:01:42.843955 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41898fd8_d078_444c_bb55_33f4fb6f3dcc.slice/crio-556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b WatchSource:0}: Error finding container 556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b: Status 404 returned error can't find the container with id 556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b Mar 13 12:01:42 crc kubenswrapper[4837]: I0313 12:01:42.861921 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.166204 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm"] Mar 13 12:01:43 crc kubenswrapper[4837]: W0313 12:01:43.172126 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeabfad13_4fe4_495d_8b6a_2da56ef3b826.slice/crio-d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd WatchSource:0}: Error finding container d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd: Status 404 returned error can't find the container with id d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.249274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" event={"ID":"41898fd8-d078-444c-bb55-33f4fb6f3dcc","Type":"ContainerStarted","Data":"556281005f3a1cf373a5555b2550b92e059922df58c9f858f29a146f510ef88b"} Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.250837 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" event={"ID":"eabfad13-4fe4-495d-8b6a-2da56ef3b826","Type":"ContainerStarted","Data":"d8bdc8d8711afb68826ef12318f9938164f8d7d30cc7cb4e21f4a38ce87b3fbd"} Mar 13 12:01:43 crc kubenswrapper[4837]: I0313 12:01:43.543106 4837 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.277700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" event={"ID":"eabfad13-4fe4-495d-8b6a-2da56ef3b826","Type":"ContainerStarted","Data":"5867f75d893ec94df4853f1b22129c251b87de3e8df0d6428afe29859087ae4d"} Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.278303 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.279495 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" event={"ID":"41898fd8-d078-444c-bb55-33f4fb6f3dcc","Type":"ContainerStarted","Data":"6b34c95f375b9e255d75b6073d42cd050581e42da8a9db66004cbbfaa97b1979"} Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.279668 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.302299 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" podStartSLOduration=1.7485411819999999 podStartE2EDuration="6.30227717s" podCreationTimestamp="2026-03-13 12:01:42 +0000 UTC" firstStartedPulling="2026-03-13 12:01:43.176392445 +0000 UTC m=+818.814659218" lastFinishedPulling="2026-03-13 12:01:47.730128433 +0000 UTC m=+823.368395206" observedRunningTime="2026-03-13 12:01:48.297896902 +0000 UTC m=+823.936163665" watchObservedRunningTime="2026-03-13 12:01:48.30227717 +0000 UTC m=+823.940543933" Mar 13 12:01:48 crc kubenswrapper[4837]: I0313 12:01:48.318925 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" podStartSLOduration=1.440459591 podStartE2EDuration="6.318872886s" podCreationTimestamp="2026-03-13 12:01:42 +0000 UTC" firstStartedPulling="2026-03-13 12:01:42.845338776 +0000 UTC m=+818.483605539" lastFinishedPulling="2026-03-13 12:01:47.723752061 +0000 UTC m=+823.362018834" observedRunningTime="2026-03-13 12:01:48.318770453 +0000 UTC m=+823.957037216" watchObservedRunningTime="2026-03-13 12:01:48.318872886 +0000 UTC m=+823.957139669" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.132800 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.134005 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.136065 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.138272 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.138276 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.145225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.314626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.416077 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.454178 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"auto-csr-approver-29556722-h599x\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:00 crc kubenswrapper[4837]: I0313 12:02:00.751873 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:01 crc kubenswrapper[4837]: I0313 12:02:01.007346 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:02:01 crc kubenswrapper[4837]: I0313 12:02:01.356053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerStarted","Data":"c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2"} Mar 13 12:02:02 crc kubenswrapper[4837]: I0313 12:02:02.872364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-59b847b88-lrvzm" Mar 13 12:02:04 crc kubenswrapper[4837]: I0313 12:02:04.375358 4837 generic.go:334] "Generic (PLEG): container finished" podID="be033789-27be-444d-b72e-7abbbb34b285" containerID="bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836" exitCode=0 Mar 13 12:02:04 crc kubenswrapper[4837]: I0313 12:02:04.375458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerDied","Data":"bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836"} Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484029 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484356 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.484459 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.485430 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.485476 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" gracePeriod=600 Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.622381 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.791469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") pod \"be033789-27be-444d-b72e-7abbbb34b285\" (UID: \"be033789-27be-444d-b72e-7abbbb34b285\") " Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.799558 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2" (OuterVolumeSpecName: "kube-api-access-jj6p2") pod "be033789-27be-444d-b72e-7abbbb34b285" (UID: "be033789-27be-444d-b72e-7abbbb34b285"). InnerVolumeSpecName "kube-api-access-jj6p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:02:05 crc kubenswrapper[4837]: I0313 12:02:05.893590 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj6p2\" (UniqueName: \"kubernetes.io/projected/be033789-27be-444d-b72e-7abbbb34b285-kube-api-access-jj6p2\") on node \"crc\" DevicePath \"\"" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556722-h599x" event={"ID":"be033789-27be-444d-b72e-7abbbb34b285","Type":"ContainerDied","Data":"c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389749 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c15701a0f861e4e4c3217a9bebcd0ebde36dbbecdb674e106e7fb3aae44db1c2" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.389277 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556722-h599x" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391572 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" exitCode=0 Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391615 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.391690 4837 scope.go:117] "RemoveContainer" containerID="2bc74d238c7c3c8f94dfb05f2715b04d643751479532bb38893e7ef8db5a10d2" Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.674907 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 12:02:06 crc kubenswrapper[4837]: I0313 12:02:06.678591 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556716-csq4j"] Mar 13 12:02:07 crc kubenswrapper[4837]: I0313 12:02:07.055026 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7b275e-9d21-4da0-8bb8-0fee8434ce82" path="/var/lib/kubelet/pods/0a7b275e-9d21-4da0-8bb8-0fee8434ce82/volumes" Mar 13 12:02:21 crc kubenswrapper[4837]: I0313 12:02:21.867148 4837 scope.go:117] "RemoveContainer" containerID="b8629809cebf6aa743a349229b16e8ffb9aaa032ac5c2d5f39b44ba6478a1a13" Mar 13 12:02:22 crc kubenswrapper[4837]: I0313 12:02:22.616546 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-dcfbdf95f-7x96d" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.419937 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-f8m9m"] Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.420734 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.420751 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.420877 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="be033789-27be-444d-b72e-7abbbb34b285" containerName="oc" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.423033 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.426098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.427154 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tj5lp" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.430754 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.433784 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.438289 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.443756 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449329 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449590 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.449788 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.527864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8skdh"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.528730 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.530567 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.530870 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4kgm2" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.532299 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.533787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.543460 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.544532 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.546853 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550706 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550741 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550761 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550777 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550797 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: 
\"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550822 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550857 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.550881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-sockets\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.551612 4837 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551674 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-metrics\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-frr-conf\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.551781 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs podName:387739fd-caae-44d0-8cbb-50808d69618b nodeName:}" failed. No retries permitted until 2026-03-13 12:02:24.051761748 +0000 UTC m=+859.690028511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs") pod "frr-k8s-f8m9m" (UID: "387739fd-caae-44d0-8cbb-50808d69618b") : secret "frr-k8s-certs-secret" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.551984 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/387739fd-caae-44d0-8cbb-50808d69618b-reloader\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.552497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/387739fd-caae-44d0-8cbb-50808d69618b-frr-startup\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.560595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72405c5-2c81-43f4-93c6-f73f9771be8b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.568966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wbb\" (UniqueName: \"kubernetes.io/projected/387739fd-caae-44d0-8cbb-50808d69618b-kube-api-access-j7wbb\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.571365 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.575084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjwhv\" (UniqueName: \"kubernetes.io/projected/c72405c5-2c81-43f4-93c6-f73f9771be8b-kube-api-access-hjwhv\") pod \"frr-k8s-webhook-server-bcc4b6f68-jwgl7\" (UID: \"c72405c5-2c81-43f4-93c6-f73f9771be8b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652359 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652696 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.652952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" 
(UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653191 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.652969 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.653312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: E0313 12:02:23.653389 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist podName:82a5fe00-90be-47b1-a357-69942f385d4f nodeName:}" failed. No retries permitted until 2026-03-13 12:02:24.153358466 +0000 UTC m=+859.791625249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist") pod "speaker-8skdh" (UID: "82a5fe00-90be-47b1-a357-69942f385d4f") : secret "metallb-memberlist" not found Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.654097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/82a5fe00-90be-47b1-a357-69942f385d4f-metallb-excludel2\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.656143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-metrics-certs\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.672456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvhj\" (UniqueName: \"kubernetes.io/projected/82a5fe00-90be-47b1-a357-69942f385d4f-kube-api-access-8pvhj\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.753270 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754364 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754413 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.754459 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.757436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-metrics-certs\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.759446 4837 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.769071 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-cert\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.773386 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnk2z\" (UniqueName: \"kubernetes.io/projected/0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88-kube-api-access-bnk2z\") pod \"controller-7bb4cc7c98-zm9dj\" (UID: \"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88\") " pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:23 crc kubenswrapper[4837]: I0313 12:02:23.859587 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.057383 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.061813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/387739fd-caae-44d0-8cbb-50808d69618b-metrics-certs\") pod \"frr-k8s-f8m9m\" (UID: \"387739fd-caae-44d0-8cbb-50808d69618b\") " pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.152031 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7"] Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.158175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:24 crc kubenswrapper[4837]: E0313 12:02:24.158426 4837 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 12:02:24 crc kubenswrapper[4837]: E0313 12:02:24.158517 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist podName:82a5fe00-90be-47b1-a357-69942f385d4f nodeName:}" failed. No retries permitted until 2026-03-13 12:02:25.158480496 +0000 UTC m=+860.796747279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist") pod "speaker-8skdh" (UID: "82a5fe00-90be-47b1-a357-69942f385d4f") : secret "metallb-memberlist" not found Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.278700 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zm9dj"] Mar 13 12:02:24 crc kubenswrapper[4837]: W0313 12:02:24.284989 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad270d6_2fc1_4ed0_8a87_bef0e59a4c88.slice/crio-457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96 WatchSource:0}: Error finding container 457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96: Status 404 returned error can't find the container with id 457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96 Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.339047 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.506809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"b8bb302cedd72c254f632582758d289ae62e01952ef332c906824ebc90cecb1d"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.507142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"457175fc4d232168abb6b2f253ec67d16917fe945801bf05877ce399cde3ce96"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.508366 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"b41e6ab652b91a272d7eee0125a9021991f1281f0073d450f1287f1aef1cfd76"} Mar 13 12:02:24 crc kubenswrapper[4837]: I0313 12:02:24.509289 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" event={"ID":"c72405c5-2c81-43f4-93c6-f73f9771be8b","Type":"ContainerStarted","Data":"26fb66e61c49f8b8b5a46f19013cb3458098d9ddf498c1690ff836772ed59a46"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.174418 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.179999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/82a5fe00-90be-47b1-a357-69942f385d4f-memberlist\") pod \"speaker-8skdh\" (UID: \"82a5fe00-90be-47b1-a357-69942f385d4f\") " pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.341270 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8skdh" Mar 13 12:02:25 crc kubenswrapper[4837]: W0313 12:02:25.361235 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a5fe00_90be_47b1_a357_69942f385d4f.slice/crio-8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef WatchSource:0}: Error finding container 8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef: Status 404 returned error can't find the container with id 8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.515994 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zm9dj" event={"ID":"0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88","Type":"ContainerStarted","Data":"89c056d8248ba48802df5e978e56d1e88f2bb66373245aa662decf4a5c20bcd3"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.516132 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.517093 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"8a95e0e1133aa61588bf3a7af70f6cf08e0764b1c2a26b8e431cb055e75073ef"} Mar 13 12:02:25 crc kubenswrapper[4837]: I0313 12:02:25.536114 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-zm9dj" podStartSLOduration=2.536094843 podStartE2EDuration="2.536094843s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:02:25.532986184 +0000 UTC m=+861.171252947" watchObservedRunningTime="2026-03-13 12:02:25.536094843 +0000 UTC m=+861.174361606" Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.532336 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"5be0eb884d925b3f3e7348d6400f7da8b66f5bdb2380949f71ee452caecf0ef5"} Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.532741 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8skdh" event={"ID":"82a5fe00-90be-47b1-a357-69942f385d4f","Type":"ContainerStarted","Data":"56dbb91f33e12bab2633a967716fbd79b888097682fe4c05115296fba6fda7d9"} Mar 13 12:02:26 crc kubenswrapper[4837]: I0313 12:02:26.560391 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8skdh" podStartSLOduration=3.560371478 podStartE2EDuration="3.560371478s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:02:26.556607379 +0000 UTC m=+862.194874142" watchObservedRunningTime="2026-03-13 12:02:26.560371478 +0000 UTC m=+862.198638241" Mar 13 12:02:27 crc kubenswrapper[4837]: I0313 12:02:27.539436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8skdh" Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.580679 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" 
event={"ID":"c72405c5-2c81-43f4-93c6-f73f9771be8b","Type":"ContainerStarted","Data":"fc4ab59ac329b89ecfa18cdec798aa94b9bda1f43bf8a39626b79ce7619cbe23"} Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.581201 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.582810 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="35462f4ca915f217aa024a344aa2bc5178b1a67828449d152ba31abfe87cc855" exitCode=0 Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.582840 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"35462f4ca915f217aa024a344aa2bc5178b1a67828449d152ba31abfe87cc855"} Mar 13 12:02:31 crc kubenswrapper[4837]: I0313 12:02:31.600418 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" podStartSLOduration=1.4743894229999999 podStartE2EDuration="8.600399575s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="2026-03-13 12:02:24.156551235 +0000 UTC m=+859.794818008" lastFinishedPulling="2026-03-13 12:02:31.282561397 +0000 UTC m=+866.920828160" observedRunningTime="2026-03-13 12:02:31.598211165 +0000 UTC m=+867.236477938" watchObservedRunningTime="2026-03-13 12:02:31.600399575 +0000 UTC m=+867.238666338" Mar 13 12:02:32 crc kubenswrapper[4837]: I0313 12:02:32.594942 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="29e0acbb4a6feb2029d8a9a6dd8c4183f5cd39fe99116e0bc5bbd9c5fe89c086" exitCode=0 Mar 13 12:02:32 crc kubenswrapper[4837]: I0313 12:02:32.595066 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"29e0acbb4a6feb2029d8a9a6dd8c4183f5cd39fe99116e0bc5bbd9c5fe89c086"} Mar 13 12:02:33 crc kubenswrapper[4837]: I0313 12:02:33.601418 4837 generic.go:334] "Generic (PLEG): container finished" podID="387739fd-caae-44d0-8cbb-50808d69618b" containerID="f924d13940cc44e7e612d34c839c313fd8ac1246f6d95b8a6a73e71f8b63be42" exitCode=0 Mar 13 12:02:33 crc kubenswrapper[4837]: I0313 12:02:33.601591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerDied","Data":"f924d13940cc44e7e612d34c839c313fd8ac1246f6d95b8a6a73e71f8b63be42"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.612621 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"853a312f5903550334f26e24dcdd9788b6411dd521712ae569793e38de62f3ac"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.612986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"853f06af3d709819d2e3489c41e6d0dc6962dd4038a8e0973632ad9645455449"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" 
event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"6e9febbd67d8cfc28862e5ca4062fe2237c13330afe7a73d7ab8fd66b7db3ac1"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"6e4b8decb3a41d01d34dddff7572c2acbbf179709004b0067f06773be6e96cad"} Mar 13 12:02:34 crc kubenswrapper[4837]: I0313 12:02:34.613023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"4c52a3263ccff805d580c7ba5c486d9728224a126099a3844174a655333fc069"} Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.345549 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8skdh" Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.627414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-f8m9m" event={"ID":"387739fd-caae-44d0-8cbb-50808d69618b","Type":"ContainerStarted","Data":"bf887ff1cd304fca174852af0fbe35baab6b293e7db603955d32f541111f8d86"} Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.627586 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:35 crc kubenswrapper[4837]: I0313 12:02:35.659593 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-f8m9m" podStartSLOduration=5.809394477 podStartE2EDuration="12.659569662s" podCreationTimestamp="2026-03-13 12:02:23 +0000 UTC" firstStartedPulling="2026-03-13 12:02:24.44650665 +0000 UTC m=+860.084773413" lastFinishedPulling="2026-03-13 12:02:31.296681835 +0000 UTC m=+866.934948598" observedRunningTime="2026-03-13 12:02:35.65635905 +0000 UTC m=+871.294625883" watchObservedRunningTime="2026-03-13 12:02:35.659569662 +0000 UTC m=+871.297836465" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.061955 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.063190 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.067361 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.067435 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.075924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.150274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.251096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.272094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"openstack-operator-index-twrbr\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.388750 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:38 crc kubenswrapper[4837]: I0313 12:02:38.812618 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.340302 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.377221 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:39 crc kubenswrapper[4837]: I0313 12:02:39.666419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerStarted","Data":"43335a5723133f6d0b760252b36ec6f1a304a0d7335bf4c3d664b6185d08440a"} Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.027425 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.430208 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.430937 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.435141 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-z5kdt" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.440344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.583735 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.685835 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.708226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5cg\" (UniqueName: \"kubernetes.io/projected/9da10ec5-aa1b-4797-91ce-04a91266831a-kube-api-access-6n5cg\") pod \"openstack-operator-index-mdjzs\" (UID: \"9da10ec5-aa1b-4797-91ce-04a91266831a\") " pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:40 crc kubenswrapper[4837]: I0313 12:02:40.758000 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.575200 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mdjzs"] Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.684614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerStarted","Data":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.684713 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-twrbr" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" containerID="cri-o://274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" gracePeriod=2 Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.686949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mdjzs" event={"ID":"9da10ec5-aa1b-4797-91ce-04a91266831a","Type":"ContainerStarted","Data":"f9980e6b3a1501a81063427d9fefa166155c0cd4539721d0622d9171d78b3bf4"} Mar 13 12:02:41 crc kubenswrapper[4837]: I0313 12:02:41.701378 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-twrbr" podStartSLOduration=1.3001593200000001 podStartE2EDuration="3.70135336s" podCreationTimestamp="2026-03-13 12:02:38 +0000 UTC" firstStartedPulling="2026-03-13 12:02:38.825557927 +0000 UTC m=+874.463824710" lastFinishedPulling="2026-03-13 12:02:41.226751987 +0000 UTC m=+876.865018750" observedRunningTime="2026-03-13 12:02:41.69659796 +0000 UTC m=+877.334864723" watchObservedRunningTime="2026-03-13 12:02:41.70135336 +0000 UTC m=+877.339620123" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.064933 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.110201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") pod \"a7e86b46-33ca-4192-92b4-d01e0a74007f\" (UID: \"a7e86b46-33ca-4192-92b4-d01e0a74007f\") " Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.115284 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d" (OuterVolumeSpecName: "kube-api-access-zmg2d") pod "a7e86b46-33ca-4192-92b4-d01e0a74007f" (UID: "a7e86b46-33ca-4192-92b4-d01e0a74007f"). InnerVolumeSpecName "kube-api-access-zmg2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.212240 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmg2d\" (UniqueName: \"kubernetes.io/projected/a7e86b46-33ca-4192-92b4-d01e0a74007f-kube-api-access-zmg2d\") on node \"crc\" DevicePath \"\"" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694074 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" exitCode=0 Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694147 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerDied","Data":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-twrbr" event={"ID":"a7e86b46-33ca-4192-92b4-d01e0a74007f","Type":"ContainerDied","Data":"43335a5723133f6d0b760252b36ec6f1a304a0d7335bf4c3d664b6185d08440a"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694196 4837 scope.go:117] "RemoveContainer" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.694308 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-twrbr" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.700606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mdjzs" event={"ID":"9da10ec5-aa1b-4797-91ce-04a91266831a","Type":"ContainerStarted","Data":"ff8978a385436862c2dd9165a5895fd9b48507db65668e8474bde2b10137cecc"} Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.719994 4837 scope.go:117] "RemoveContainer" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: E0313 12:02:42.721881 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": container with ID starting with 274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc not found: ID does not exist" containerID="274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.721927 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc"} err="failed to get container status \"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": rpc error: code = NotFound desc = could not find container \"274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc\": container with ID starting with 274b5fb94f92e54f6dea922d649f43fbf44d8ca2323031bfbfbe1f34754319dc not found: ID does not exist" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.724120 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mdjzs" podStartSLOduration=2.678480911 podStartE2EDuration="2.724096216s" podCreationTimestamp="2026-03-13 12:02:40 +0000 UTC" firstStartedPulling="2026-03-13 12:02:41.583878639 
+0000 UTC m=+877.222145402" lastFinishedPulling="2026-03-13 12:02:41.629493954 +0000 UTC m=+877.267760707" observedRunningTime="2026-03-13 12:02:42.720202913 +0000 UTC m=+878.358469676" watchObservedRunningTime="2026-03-13 12:02:42.724096216 +0000 UTC m=+878.362362999" Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.739009 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:42 crc kubenswrapper[4837]: I0313 12:02:42.744028 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-twrbr"] Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.057231 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" path="/var/lib/kubelet/pods/a7e86b46-33ca-4192-92b4-d01e0a74007f/volumes" Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.757815 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jwgl7" Mar 13 12:02:43 crc kubenswrapper[4837]: I0313 12:02:43.866575 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-zm9dj" Mar 13 12:02:44 crc kubenswrapper[4837]: I0313 12:02:44.342364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-f8m9m" Mar 13 12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.758167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.758763 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:50 crc kubenswrapper[4837]: I0313 12:02:50.785889 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:02:51 crc kubenswrapper[4837]: I0313 12:02:51.776694 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mdjzs" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.069899 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: E0313 12:03:04.071347 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.071366 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.071536 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e86b46-33ca-4192-92b4-d01e0a74007f" containerName="registry-server" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.072581 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.078223 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.078564 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kf99l" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087563 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087699 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.087728 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.188490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.188624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.188688 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.189308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.189332 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.212656 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.391039 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.797836 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b"] Mar 13 12:03:04 crc kubenswrapper[4837]: I0313 12:03:04.835245 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerStarted","Data":"db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a"} Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.843760 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerID="a274adcdd19e87ea805b41482270fea6edd2aed6a3b3d5426c7b9c13123d6942" exitCode=0 Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.843875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"a274adcdd19e87ea805b41482270fea6edd2aed6a3b3d5426c7b9c13123d6942"} Mar 13 12:03:05 crc kubenswrapper[4837]: I0313 12:03:05.846715 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:03:06 crc kubenswrapper[4837]: I0313 12:03:06.855088 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerID="059c38107b2a0b253e4cfb490b581f58a0086997bd0ae22c09f0f4b699ad0737" exitCode=0 Mar 13 12:03:06 crc kubenswrapper[4837]: I0313 12:03:06.855171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"059c38107b2a0b253e4cfb490b581f58a0086997bd0ae22c09f0f4b699ad0737"} Mar 13 12:03:07 crc kubenswrapper[4837]: I0313 12:03:07.865669 4837 generic.go:334] "Generic (PLEG): container finished" podID="53ac9dfc-487a-47cf-83f2-91542b93bb95" 
containerID="d9cfaabf006620bf7355481966b22d6484289d6fe86e72479ae98c95a57d85b1" exitCode=0 Mar 13 12:03:07 crc kubenswrapper[4837]: I0313 12:03:07.865729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"d9cfaabf006620bf7355481966b22d6484289d6fe86e72479ae98c95a57d85b1"} Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.135487 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.250803 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.250936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.251041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") pod \"53ac9dfc-487a-47cf-83f2-91542b93bb95\" (UID: \"53ac9dfc-487a-47cf-83f2-91542b93bb95\") " Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.251654 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle" (OuterVolumeSpecName: "bundle") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.256222 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m" (OuterVolumeSpecName: "kube-api-access-8r52m") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "kube-api-access-8r52m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.264561 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util" (OuterVolumeSpecName: "util") pod "53ac9dfc-487a-47cf-83f2-91542b93bb95" (UID: "53ac9dfc-487a-47cf-83f2-91542b93bb95"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352293 4837 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-util\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352578 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r52m\" (UniqueName: \"kubernetes.io/projected/53ac9dfc-487a-47cf-83f2-91542b93bb95-kube-api-access-8r52m\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.352773 4837 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53ac9dfc-487a-47cf-83f2-91542b93bb95-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886004 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" event={"ID":"53ac9dfc-487a-47cf-83f2-91542b93bb95","Type":"ContainerDied","Data":"db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a"} Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886051 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db248d00a2576920070079868eff75c4075eb99329a51ca1c2b9e34722c9b26a" Mar 13 12:03:09 crc kubenswrapper[4837]: I0313 12:03:09.886069 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.112297 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113089 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113107 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113126 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="pull" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113133 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="pull" Mar 13 12:03:16 crc kubenswrapper[4837]: E0313 12:03:16.113147 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="util" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113155 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="util" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113301 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ac9dfc-487a-47cf-83f2-91542b93bb95" containerName="extract" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.113857 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.116565 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-bvswq" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.132826 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.282027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.383370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.408331 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62h5n\" (UniqueName: \"kubernetes.io/projected/4f8c5e9e-7680-4bc3-8096-0c62a1de4da5-kube-api-access-62h5n\") pod \"openstack-operator-controller-init-c99df78b8-qxmfb\" (UID: \"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5\") " pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.444068 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.898154 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb"] Mar 13 12:03:16 crc kubenswrapper[4837]: I0313 12:03:16.935497 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" event={"ID":"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5","Type":"ContainerStarted","Data":"a91152655fd513fa810a620170966830e4d18b0c4296c0a2036388aef535cced"} Mar 13 12:03:21 crc kubenswrapper[4837]: I0313 12:03:21.973066 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" event={"ID":"4f8c5e9e-7680-4bc3-8096-0c62a1de4da5","Type":"ContainerStarted","Data":"1b1541ed1a3eef95d859359a0a9ace57b247ce760a7f62e559775c7828759bb7"} Mar 13 12:03:21 crc kubenswrapper[4837]: I0313 12:03:21.973910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:22 crc kubenswrapper[4837]: I0313 12:03:22.021254 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" podStartSLOduration=1.770435438 podStartE2EDuration="6.021223354s" podCreationTimestamp="2026-03-13 12:03:16 +0000 UTC" firstStartedPulling="2026-03-13 12:03:16.913162993 +0000 UTC m=+912.551429756" lastFinishedPulling="2026-03-13 12:03:21.163950909 +0000 UTC m=+916.802217672" observedRunningTime="2026-03-13 12:03:22.010552667 +0000 UTC m=+917.648819440" watchObservedRunningTime="2026-03-13 12:03:22.021223354 +0000 UTC m=+917.659490117" Mar 13 12:03:26 crc kubenswrapper[4837]: I0313 12:03:26.457773 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-c99df78b8-qxmfb" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.962125 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.963362 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.965429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c69xz" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.976687 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.981209 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.981964 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.985021 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-twrwq" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.989065 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.990201 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.991576 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2lzrn" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.993806 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"] Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.994659 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.996348 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2qszj" Mar 13 12:03:59 crc kubenswrapper[4837]: I0313 12:03:59.998370 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.004119 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.009190 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.009960 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.011687 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gk8bz" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.027986 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.066523 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070834 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070893 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070920 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: \"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.070957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.076286 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.094899 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.103234 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m6md6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.133924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.162761 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.163775 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171246 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171324 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzbzz" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171917 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.171950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172017 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: 
\"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.172108 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.199378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhls\" (UniqueName: \"kubernetes.io/projected/1d59bb7f-598d-4c70-9b8c-ce4e3048691f-kube-api-access-ffhls\") pod \"barbican-operator-controller-manager-677bd678f7-jvdqq\" (UID: \"1d59bb7f-598d-4c70-9b8c-ce4e3048691f\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.201583 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.201945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvm59\" (UniqueName: \"kubernetes.io/projected/e645f00a-8463-4fac-b010-f0500b54d68a-kube-api-access-jvm59\") pod \"designate-operator-controller-manager-66d56f6ff4-b7cdx\" (UID: \"e645f00a-8463-4fac-b010-f0500b54d68a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.202698 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.205685 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-d57j6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.207264 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5qp\" (UniqueName: \"kubernetes.io/projected/0a24601d-8e41-4f99-9e33-870d791a3e7e-kube-api-access-qf5qp\") pod \"cinder-operator-controller-manager-984cd4dcf-kbn8z\" (UID: \"0a24601d-8e41-4f99-9e33-870d791a3e7e\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.214552 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgth\" (UniqueName: \"kubernetes.io/projected/1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e-kube-api-access-9dgth\") pod \"glance-operator-controller-manager-5964f64c48-mrgb9\" (UID: \"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.219280 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.244509 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.264756 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.265943 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.268534 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cwfjs" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.270274 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.271039 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.273585 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.278946 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-674gz" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.283371 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.284502 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.306677 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.307852 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.307911 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.308937 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.316998 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.317208 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9hk47" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.317588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnsd\" (UniqueName: \"kubernetes.io/projected/11a29883-0638-4da4-a1dc-bf2127a3645c-kube-api-access-5mnsd\") pod \"horizon-operator-controller-manager-6d9d6b584d-bvmr7\" (UID: \"11a29883-0638-4da4-a1dc-bf2127a3645c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.329387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jq6\" (UniqueName: \"kubernetes.io/projected/b2c881d7-03db-4608-a3f4-9a9ad8b2f5da-kube-api-access-t9jq6\") pod \"heat-operator-controller-manager-77b6666d85-ss4rm\" (UID: \"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.334601 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.335888 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.339714 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.342654 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.345089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348576 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348757 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.348929 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.351401 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.362708 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.363586 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.368842 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-grrds" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374291 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2hg\" (UniqueName: \"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374385 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374420 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374457 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.374520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.374968 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.375018 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:00.875001849 +0000 UTC m=+956.513268612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.391648 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.394615 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5zl\" (UniqueName: \"kubernetes.io/projected/89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5-kube-api-access-xb5zl\") pod \"ironic-operator-controller-manager-6bbb499bbc-9zvxf\" (UID: \"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.395703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnnw\" (UniqueName: \"kubernetes.io/projected/c19c3466-ab50-4be3-8299-d7b8b3d263df-kube-api-access-jfnnw\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.396558 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.397489 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.401492 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.402275 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k657h" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.408445 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.412252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.418814 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-69bgg" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.420602 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.422312 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.427649 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.432936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.435824 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.437722 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.438499 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441074 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441580 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qzn2w" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.441763 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5pgmr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.448067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.453435 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.468355 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.468434 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.471442 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.473223 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k86kd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475725 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475777 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475839 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475897 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2hg\" (UniqueName: 
\"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475921 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.475999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.476062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.478117 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.485463 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.494183 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zkdlz" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.500606 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.515575 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvfhj\" (UniqueName: \"kubernetes.io/projected/046bdee0-f0cf-4d17-916b-68d301502473-kube-api-access-kvfhj\") pod \"mariadb-operator-controller-manager-658d4cdd5-7nm95\" (UID: \"046bdee0-f0cf-4d17-916b-68d301502473\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.519177 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.545287 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.550258 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wkl4s" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.554155 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvmnr\" (UniqueName: \"kubernetes.io/projected/9bd066a9-3999-405a-b619-540678a46ded-kube-api-access-xvmnr\") pod \"keystone-operator-controller-manager-684f77d66d-kc2x6\" (UID: \"9bd066a9-3999-405a-b619-540678a46ded\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.574510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2hg\" (UniqueName: \"kubernetes.io/projected/fa1b1ba2-3856-49cb-bda4-8ac5e63b5298-kube-api-access-9h2hg\") pod \"manila-operator-controller-manager-68f45f9d9f-twrg7\" (UID: \"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581392 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: 
\"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581453 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581499 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581522 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581544 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.581603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6n5d\" (UniqueName: 
\"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.589000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.589896 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.590015 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.089970358 +0000 UTC m=+956.728237121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.609627 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.633985 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4g74\" (UniqueName: \"kubernetes.io/projected/3059d7c0-2624-4d3e-af0f-de054401f1ec-kube-api-access-j4g74\") pod \"neutron-operator-controller-manager-776c5696bf-6ht9l\" (UID: \"3059d7c0-2624-4d3e-af0f-de054401f1ec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.635245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"auto-csr-approver-29556724-st6gn\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.643422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpdw\" (UniqueName: \"kubernetes.io/projected/561aed86-f289-4dd1-8c53-307ccdc99165-kube-api-access-pvpdw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-7f7zd\" (UID: \"561aed86-f289-4dd1-8c53-307ccdc99165\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.645576 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.651165 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.652230 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54dqh\" (UniqueName: \"kubernetes.io/projected/7b38159c-e030-4734-963d-dfc38d29c75c-kube-api-access-54dqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.654332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.658311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-t6g4b" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.667497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b48\" (UniqueName: \"kubernetes.io/projected/ee1c592d-7979-4b75-b8e4-7ccd6d7d6048-kube-api-access-p9b48\") pod \"nova-operator-controller-manager-569cc54c5-shrx7\" (UID: \"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.679256 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.680391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.691243 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-q48z8" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.700493 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.702695 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715851 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6n5d\" (UniqueName: \"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715900 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715931 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: \"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.715999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.716038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.721533 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.733391 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.740971 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.754052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8mf\" (UniqueName: \"kubernetes.io/projected/35a21ab1-95b5-446a-ae10-d004e5aa2995-kube-api-access-zr8mf\") pod \"placement-operator-controller-manager-574d45c66c-fwblp\" (UID: \"35a21ab1-95b5-446a-ae10-d004e5aa2995\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.757042 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nhg\" (UniqueName: \"kubernetes.io/projected/55649f1c-678e-4e03-be55-7c4435446199-kube-api-access-g7nhg\") pod \"swift-operator-controller-manager-677c674df7-cfv8z\" (UID: \"55649f1c-678e-4e03-be55-7c4435446199\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.764161 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swds9\" (UniqueName: \"kubernetes.io/projected/cb20db22-bd0e-4897-8ed6-a6a80a91ffff-kube-api-access-swds9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-8lkmx\" (UID: \"cb20db22-bd0e-4897-8ed6-a6a80a91ffff\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.767368 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6n5d\" (UniqueName: \"kubernetes.io/projected/5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d-kube-api-access-k6n5d\") pod \"ovn-operator-controller-manager-bbc5b68f9-nxwr9\" (UID: \"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.768078 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.777195 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.783671 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.783782 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.789508 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.801709 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.801901 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.802330 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x6jmh" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.824082 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.824254 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.831333 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.839229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.840302 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.843802 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6xrg5" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.861068 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.869032 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8v8b\" (UniqueName: \"kubernetes.io/projected/fe107e39-b5ec-473d-8851-b57775dadafc-kube-api-access-c8v8b\") pod \"test-operator-controller-manager-5c5cb9c4d7-dk4nr\" (UID: \"fe107e39-b5ec-473d-8851-b57775dadafc\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.895246 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.906969 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925428 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925695 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925746 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925783 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.925943 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.925946 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:00 crc kubenswrapper[4837]: E0313 12:04:00.926002 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.925979842 +0000 UTC m=+957.564246665 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.939689 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.946462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-984ct\" (UniqueName: \"kubernetes.io/projected/5ef20b1d-5c03-4993-b635-b031ddcab3bf-kube-api-access-984ct\") pod \"watcher-operator-controller-manager-6dd88c6f67-hrcp9\" (UID: \"5ef20b1d-5c03-4993-b635-b031ddcab3bf\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:00 crc kubenswrapper[4837]: I0313 12:04:00.975082 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027477 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027492 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.027514 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027530 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.527514278 +0000 UTC m=+957.165781041 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027716 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.027754 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:01.527742375 +0000 UTC m=+957.166009138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.049317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wqr\" (UniqueName: \"kubernetes.io/projected/eaf3fa29-f441-43df-9fbe-409d9d8ad871-kube-api-access-v7wqr\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.053146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dls2b\" (UniqueName: \"kubernetes.io/projected/ce0c89e1-3fc0-473d-875f-461c8b423061-kube-api-access-dls2b\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xkk4z\" (UID: \"ce0c89e1-3fc0-473d-875f-461c8b423061\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.074168 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.109273 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.129321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.129563 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.129618 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.129600932 +0000 UTC m=+957.767867695 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.186512 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.293050 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" event={"ID":"1d59bb7f-598d-4c70-9b8c-ce4e3048691f","Type":"ContainerStarted","Data":"7d8e7bfd32eadcbd104d30b096182e4677f1eafdf69317cc33edd58d9d0b72f9"} Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.362128 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.369272 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.381386 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.414222 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode645f00a_8463_4fac_b010_f0500b54d68a.slice/crio-c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8 WatchSource:0}: Error finding container c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8: Status 404 returned error can't find the container with id c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8 Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.543841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.544215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544012 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544362 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.544348349 +0000 UTC m=+958.182615102 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544317 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.544771 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:02.544760062 +0000 UTC m=+958.183026825 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.787716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.817091 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.840062 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a29883_0638_4da4_a1dc_bf2127a3645c.slice/crio-df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e WatchSource:0}: Error finding container df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e: Status 404 returned error can't find the container with id df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.857697 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf"] Mar 13 12:04:01 crc kubenswrapper[4837]: W0313 12:04:01.868291 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046bdee0_f0cf_4d17_916b_68d301502473.slice/crio-973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e WatchSource:0}: Error finding container 973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e: Status 404 returned error can't find the container with id 973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.877794 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.882167 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.886207 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.890043 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.896240 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.901841 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95"] Mar 13 12:04:01 crc kubenswrapper[4837]: I0313 12:04:01.949970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.950145 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:01 crc kubenswrapper[4837]: E0313 12:04:01.950249 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:03.950221965 +0000 UTC m=+959.588488798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.106463 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.108038 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.123446 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.153260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.154596 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.154671 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:04.154653971 +0000 UTC m=+959.792920744 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.175982 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.192356 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3059d7c0_2624_4d3e_af0f_de054401f1ec.slice/crio-d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718 WatchSource:0}: Error finding container d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718: Status 404 returned error can't find the container with id d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718 Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.193792 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.199354 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0c89e1_3fc0_473d_875f_461c8b423061.slice/crio-0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99 WatchSource:0}: Error finding container 0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99: Status 404 returned error can't find the container with id 0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99 Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.212601 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx"] Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.218112 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swds9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_openstack-operators(cb20db22-bd0e-4897-8ed6-a6a80a91ffff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.219463 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.229117 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k6n5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-nxwr9_openstack-operators(5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.230699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.231608 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zr8mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-fwblp_openstack-operators(35a21ab1-95b5-446a-ae10-d004e5aa2995): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.231782 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7nhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-cfv8z_openstack-operators(55649f1c-678e-4e03-be55-7c4435446199): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc 
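The long &Container{...} dumps in these "Unhandled Error" entries are the Go textual form of each manager container's spec, printed when the container fails to start. Rewritten with client-go types for readability, the telemetry-operator dump above corresponds roughly to the sketch below; only fields visible in the dump are reproduced, and the security context and service-account volume mount are omitted:

package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// managerContainer mirrors the container spec dumped in the
// kuberuntime_manager.go entry for the telemetry operator above.
var managerContainer = corev1.Container{
	Name:    "manager",
	Image:   "quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d",
	Command: []string{"/manager"},
	Args: []string{
		"--leader-elect",
		"--health-probe-bind-address=:8081",
		"--metrics-bind-address=127.0.0.1:8080",
	},
	Env: []corev1.EnvVar{
		{Name: "LEASE_DURATION", Value: "30"},
		{Name: "RENEW_DEADLINE", Value: "20"},
		{Name: "RETRY_PERIOD", Value: "5"},
		{Name: "ENABLE_WEBHOOKS", Value: "false"},
		{Name: "METRICS_CERTS", Value: "false"},
	},
	Resources: corev1.ResourceRequirements{
		// 536870912 bytes = 512Mi, 268435456 bytes = 256Mi in the dump.
		Limits: corev1.ResourceList{
			corev1.ResourceCPU:    resource.MustParse("500m"),
			corev1.ResourceMemory: resource.MustParse("512Mi"),
		},
		Requests: corev1.ResourceList{
			corev1.ResourceCPU:    resource.MustParse("10m"),
			corev1.ResourceMemory: resource.MustParse("256Mi"),
		},
	},
	LivenessProbe: &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(8081)},
		},
		InitialDelaySeconds: 15,
		PeriodSeconds:       20,
	},
	ReadinessProbe: &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt(8081)},
		},
		InitialDelaySeconds: 5,
		PeriodSeconds:       10,
	},
}

func main() { _ = managerContainer }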
kubenswrapper[4837]: E0313 12:04:02.232766 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.233070 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.233086 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9"] Mar 13 12:04:02 crc kubenswrapper[4837]: W0313 12:04:02.238412 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef20b1d_5c03_4993_b635_b031ddcab3bf.slice/crio-dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a WatchSource:0}: Error finding container dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a: Status 404 returned error can't find the container with id dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.238603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z"] Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.242239 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c8v8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-dk4nr_openstack-operators(fe107e39-b5ec-473d-8851-b57775dadafc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.242455 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-984ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-hrcp9_openstack-operators(5ef20b1d-5c03-4993-b635-b031ddcab3bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 12:04:02 crc 
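Every one of these start failures carries the same root-cause string, ErrImagePull: "pull QPS exceeded". That error comes from the kubelet's image-pull rate limit (the registryPullQPS and registryBurst settings in the kubelet configuration, which default to 5 pulls per second with a burst of 10; the defaults are assumed for this node). With more than a dozen operator pods created in the same second, pulls past the burst are rejected rather than queued, and the affected pods fall back to ImagePullBackOff until a later retry succeeds. A minimal token-bucket sketch of that behaviour, as an illustration rather than the kubelet's own code:

package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Token bucket with the documented kubelet defaults:
	// registryPullQPS=5, registryBurst=10 (assumed for this node).
	limiter := rate.NewLimiter(rate.Limit(5), 10)

	// Simulate many operator pods requesting image pulls at the same instant.
	for pull := 1; pull <= 15; pull++ {
		if limiter.Allow() {
			fmt.Printf("pull %2d: allowed\n", pull)
		} else {
			// Past the burst, the request is rejected outright; the kubelet
			// surfaces this as ErrImagePull: "pull QPS exceeded".
			fmt.Printf("pull %2d: rejected (QPS exceeded)\n", pull)
		}
	}
}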
kubenswrapper[4837]: E0313 12:04:02.243559 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.243559 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.244993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254361 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254444 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.254470 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.255063 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.268784 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9"] Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.304137 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" event={"ID":"561aed86-f289-4dd1-8c53-307ccdc99165","Type":"ContainerStarted","Data":"7a2ea750319e0d08f2039479d3b5aa46a4b3ffd7adb15c905ea1b11047e76944"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.305358 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" event={"ID":"35a21ab1-95b5-446a-ae10-d004e5aa2995","Type":"ContainerStarted","Data":"09aea184fa206c9de8e20fb65da85b6e009c3c5562b9a3b0687a0469c5882e23"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.307961 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.311744 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" event={"ID":"3059d7c0-2624-4d3e-af0f-de054401f1ec","Type":"ContainerStarted","Data":"d2925738557f48a6fb3397d20ead57f90592cd0526a571448f888a7af3946718"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.313857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" event={"ID":"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d","Type":"ContainerStarted","Data":"cf3854daf3db7c6d695a603af6be0d7698a92c72d53f54446de0440f3118ab70"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.315957 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.316430 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" event={"ID":"0a24601d-8e41-4f99-9e33-870d791a3e7e","Type":"ContainerStarted","Data":"1ee78c3d4f4d935dc6a2005765aaf1b8b7e393e23a68176976e6f0476c89191e"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.327497 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" event={"ID":"9bd066a9-3999-405a-b619-540678a46ded","Type":"ContainerStarted","Data":"34ea3ad6afe043b5e68616aa108e8badd29dadcf06d4e6a237885c1435d8fe29"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.332655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" event={"ID":"cb20db22-bd0e-4897-8ed6-a6a80a91ffff","Type":"ContainerStarted","Data":"594079f055b4e810263cce9b8b863247718bf8f2b9c52e011f72ec5413e696b1"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.334129 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" event={"ID":"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298","Type":"ContainerStarted","Data":"c15d9bec887caed298d4c5a009f3c7ef6558cf7a7c93b4119b2424a7a24efd8d"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.335397 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" event={"ID":"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e","Type":"ContainerStarted","Data":"8ae3541fd730ec7b2211ecae609278323b121bcfd0b7fa803ac1fe83c5cb7824"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.336454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" event={"ID":"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5","Type":"ContainerStarted","Data":"9c912fc0484b61004845556a1bee08639c3644b0b0868268e2c0adaa089e0521"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.338554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" 
event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerStarted","Data":"c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.340242 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" event={"ID":"ce0c89e1-3fc0-473d-875f-461c8b423061","Type":"ContainerStarted","Data":"0c846e500d27f6f332ef7b69c630d892f6f55ac4ac50d2ebaf9abf3ba3b5db99"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.341206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" event={"ID":"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048","Type":"ContainerStarted","Data":"899620c325d5224c6c75f31743d65a6033d26885415637cbb61b879f3c5bbec2"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.344165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" event={"ID":"e645f00a-8463-4fac-b010-f0500b54d68a","Type":"ContainerStarted","Data":"c980a6655ad19b2727eca7a82807c2b6b75a428e5c486aad061ecb8b214b4cb8"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.347378 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" event={"ID":"046bdee0-f0cf-4d17-916b-68d301502473","Type":"ContainerStarted","Data":"973a134af6d2e986eb360f51b4cb55f76448d07d5437dcebe10cfd92ee2b761e"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.347797 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.349252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" event={"ID":"5ef20b1d-5c03-4993-b635-b031ddcab3bf","Type":"ContainerStarted","Data":"dcc1d8f44237cca3f04c246ddf270962766cd1585c909eabc90c7c0bbbb9480a"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.351362 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" event={"ID":"11a29883-0638-4da4-a1dc-bf2127a3645c","Type":"ContainerStarted","Data":"df846e2fca5a361a69d7a486a9114ce8ebd6cfc85586f65cbad53dab6327bc7e"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.372468 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " 
pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374700 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.374724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.375172 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.376540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" event={"ID":"55649f1c-678e-4e03-be55-7c4435446199","Type":"ContainerStarted","Data":"3aba8663044e53082f5c5ada44ee2551b345d644484002e743b9c11e8ed389b4"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.378745 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.383414 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.389370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" event={"ID":"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da","Type":"ContainerStarted","Data":"4fe0b9552e8c23264df44d7e5a4c8edc527f22f45b2450d1264573009da99ed7"} Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.396152 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" event={"ID":"fe107e39-b5ec-473d-8851-b57775dadafc","Type":"ContainerStarted","Data":"d096a2ddd7df99ea480cf5b9762fa3e7f74cdb1bf44c91ac699c2a25bb91a1eb"} Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.398003 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.402421 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"community-operators-p6qtb\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.436346 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.580379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:02 crc kubenswrapper[4837]: I0313 12:04:02.580444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.581045 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.581101 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:04.581085349 +0000 UTC m=+960.219352112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.584028 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:02 crc kubenswrapper[4837]: E0313 12:04:02.584113 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:04.584089614 +0000 UTC m=+960.222356377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.024441 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:03 crc kubenswrapper[4837]: W0313 12:04:03.052709 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c WatchSource:0}: Error finding container 9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c: Status 404 returned error can't find the container with id 9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.410212 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef" exitCode=0 Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.411379 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef"} Mar 13 12:04:03 crc kubenswrapper[4837]: I0313 12:04:03.411411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c"} Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.412583 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podUID="5ef20b1d-5c03-4993-b635-b031ddcab3bf" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414107 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podUID="5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414170 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podUID="cb20db22-bd0e-4897-8ed6-a6a80a91ffff" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414239 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podUID="fe107e39-b5ec-473d-8851-b57775dadafc" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.414543 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podUID="35a21ab1-95b5-446a-ae10-d004e5aa2995" Mar 13 12:04:03 crc kubenswrapper[4837]: E0313 12:04:03.417138 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podUID="55649f1c-678e-4e03-be55-7c4435446199" Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.011459 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.013225 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.013436 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.01341905 +0000 UTC m=+963.651685813 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.217477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.217663 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.217852 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.217830425 +0000 UTC m=+963.856097188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.421114 4837 generic.go:334] "Generic (PLEG): container finished" podID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerID="945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4" exitCode=0 Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.421175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerDied","Data":"945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4"} Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.625376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:04 crc kubenswrapper[4837]: I0313 12:04:04.625457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625599 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625690 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.625670263 +0000 UTC m=+964.263937026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625731 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:04 crc kubenswrapper[4837]: E0313 12:04:04.625835 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:08.625816358 +0000 UTC m=+964.264083121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:05 crc kubenswrapper[4837]: I0313 12:04:05.483446 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:04:05 crc kubenswrapper[4837]: I0313 12:04:05.483881 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.077606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.077786 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.077971 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.077953957 +0000 UTC m=+971.716220720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.288601 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.288865 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.288972 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.2889535 +0000 UTC m=+971.927220263 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.693529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:08 crc kubenswrapper[4837]: I0313 12:04:08.693605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693740 4837 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693765 4837 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693838 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.693817515 +0000 UTC m=+972.332084278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "webhook-server-cert" not found Mar 13 12:04:08 crc kubenswrapper[4837]: E0313 12:04:08.693859 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs podName:eaf3fa29-f441-43df-9fbe-409d9d8ad871 nodeName:}" failed. No retries permitted until 2026-03-13 12:04:16.693850936 +0000 UTC m=+972.332117769 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs") pod "openstack-operator-controller-manager-55876d85bb-96mp7" (UID: "eaf3fa29-f441-43df-9fbe-409d9d8ad871") : secret "metrics-server-cert" not found Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.015511 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.020129 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.022512 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.101669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.101745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.102003 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202956 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.202996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.203169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.203683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.226670 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"certified-operators-84c45\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:09 crc kubenswrapper[4837]: I0313 12:04:09.345368 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.600486 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.743107 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") pod \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\" (UID: \"8bda3181-d107-4de8-b754-e5e67dd8dd9c\") " Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.746796 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh" (OuterVolumeSpecName: "kube-api-access-c85bh") pod "8bda3181-d107-4de8-b754-e5e67dd8dd9c" (UID: "8bda3181-d107-4de8-b754-e5e67dd8dd9c"). InnerVolumeSpecName "kube-api-access-c85bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:11 crc kubenswrapper[4837]: I0313 12:04:11.844241 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c85bh\" (UniqueName: \"kubernetes.io/projected/8bda3181-d107-4de8-b754-e5e67dd8dd9c-kube-api-access-c85bh\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473620 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556724-st6gn" event={"ID":"8bda3181-d107-4de8-b754-e5e67dd8dd9c","Type":"ContainerDied","Data":"c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a"} Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473678 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c507ea9ace1bdacbbcb4871524163d9cbeadd2ea422f02492bfe1b29dd12a90a" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.473727 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556724-st6gn" Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.666989 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 12:04:12 crc kubenswrapper[4837]: I0313 12:04:12.671964 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556718-7z6qj"] Mar 13 12:04:13 crc kubenswrapper[4837]: I0313 12:04:13.056444 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa01e7a4-71d3-4c91-8319-52a575269601" path="/var/lib/kubelet/pods/aa01e7a4-71d3-4c91-8319-52a575269601/volumes" Mar 13 12:04:14 crc kubenswrapper[4837]: I0313 12:04:14.967352 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:14 crc kubenswrapper[4837]: W0313 12:04:14.994756 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5b628a_9d8f_4ce7_b023_adbb2b00ff47.slice/crio-975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6 WatchSource:0}: Error finding container 975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6: Status 404 returned error can't find the container with id 975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6 Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.501238 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" event={"ID":"e645f00a-8463-4fac-b010-f0500b54d68a","Type":"ContainerStarted","Data":"c0c941079793ea3e2294c1a3ff92e74ae0f005d09a1f62fc3a195290cce0093b"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.502718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" event={"ID":"b2c881d7-03db-4608-a3f4-9a9ad8b2f5da","Type":"ContainerStarted","Data":"99018576e16e6c14b998e51892e2371fa278a42e6a7c1a766fa800f2545d0556"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.502854 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.504101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" event={"ID":"fa1b1ba2-3856-49cb-bda4-8ac5e63b5298","Type":"ContainerStarted","Data":"a981be69ac171a461fb49a754ee45225c0d4814cb8cee38ab244eb7bb64fed80"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.504179 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.511463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" event={"ID":"ce0c89e1-3fc0-473d-875f-461c8b423061","Type":"ContainerStarted","Data":"a9df8093717a2ad9e21bfeae5dc8f64f13d93b7992d61993d6c5fbbf1e5d1a53"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.523550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" event={"ID":"9bd066a9-3999-405a-b619-540678a46ded","Type":"ContainerStarted","Data":"7c848b83431cf06b7a899c3a04dc2546822780053219e04b63dec7707f4e69d3"} Mar 
13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.523716 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.526196 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" event={"ID":"561aed86-f289-4dd1-8c53-307ccdc99165","Type":"ContainerStarted","Data":"d26b09a496068c126b5dd06e2c21171667180c92f655372f30818b492936ef3c"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.526846 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.530934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" event={"ID":"3059d7c0-2624-4d3e-af0f-de054401f1ec","Type":"ContainerStarted","Data":"dbda97fc4d7bb724557e5593f08fd816794b762a8cefab05bc3dc86500e3e7f2"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.531035 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.538185 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" event={"ID":"1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e","Type":"ContainerStarted","Data":"9f7b0a3f4094dd090d148a3a68dbef241db6f18aca6d8bcdd79501087bfa8e48"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.538471 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.543909 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" podStartSLOduration=3.229906324 podStartE2EDuration="16.543893626s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.419009329 +0000 UTC m=+957.057276092" lastFinishedPulling="2026-03-13 12:04:14.732996631 +0000 UTC m=+970.371263394" observedRunningTime="2026-03-13 12:04:15.541666436 +0000 UTC m=+971.179933199" watchObservedRunningTime="2026-03-13 12:04:15.543893626 +0000 UTC m=+971.182160389" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.547163 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.547209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.563312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" event={"ID":"11a29883-0638-4da4-a1dc-bf2127a3645c","Type":"ContainerStarted","Data":"8fe6623e92501c7039280f383c00c64769ab2163095c7634b3a67dbbde9d8baa"} Mar 13 
12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.563955 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.580388 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" podStartSLOduration=3.023606036 podStartE2EDuration="15.580365792s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.21429473 +0000 UTC m=+957.852561493" lastFinishedPulling="2026-03-13 12:04:14.771054486 +0000 UTC m=+970.409321249" observedRunningTime="2026-03-13 12:04:15.571601944 +0000 UTC m=+971.209868697" watchObservedRunningTime="2026-03-13 12:04:15.580365792 +0000 UTC m=+971.218632565" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.585948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" event={"ID":"89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5","Type":"ContainerStarted","Data":"a5b6c5bc653b71ef848edfcf85e66d28959963c70cfcfdf4b0090e115f21466e"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.586803 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.606611 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" podStartSLOduration=2.759714026 podStartE2EDuration="15.606594122s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.843960609 +0000 UTC m=+957.482227372" lastFinishedPulling="2026-03-13 12:04:14.690840715 +0000 UTC m=+970.329107468" observedRunningTime="2026-03-13 12:04:15.601030626 +0000 UTC m=+971.239297389" watchObservedRunningTime="2026-03-13 12:04:15.606594122 +0000 UTC m=+971.244860885" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.618053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" event={"ID":"046bdee0-f0cf-4d17-916b-68d301502473","Type":"ContainerStarted","Data":"e79795df944d63a92a34cf26e3d6864ae6401490c7cee14a7c7f905c4b2dfdb6"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.618861 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.633768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" event={"ID":"1d59bb7f-598d-4c70-9b8c-ce4e3048691f","Type":"ContainerStarted","Data":"76bb8a1a95a22170f96278f0e14b759c03de32bf8150a13d7828cdf16602339b"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.634467 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.653394 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" podStartSLOduration=3.280086905 podStartE2EDuration="16.653362334s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" 
firstStartedPulling="2026-03-13 12:04:01.402650271 +0000 UTC m=+957.040917034" lastFinishedPulling="2026-03-13 12:04:14.7759257 +0000 UTC m=+970.414192463" observedRunningTime="2026-03-13 12:04:15.649348366 +0000 UTC m=+971.287615129" watchObservedRunningTime="2026-03-13 12:04:15.653362334 +0000 UTC m=+971.291629097" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.656045 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.672406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" event={"ID":"ee1c592d-7979-4b75-b8e4-7ccd6d7d6048","Type":"ContainerStarted","Data":"025c8bb60386947143b36cf09cd303596ca8f1daa7b656f7f24c96249980901d"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.673897 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.699738 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" event={"ID":"0a24601d-8e41-4f99-9e33-870d791a3e7e","Type":"ContainerStarted","Data":"0087edc6acb16aab7cac54b3862d534769dba2c9621d81325a105b830409440e"} Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.700148 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.717597 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" podStartSLOduration=2.721752415 podStartE2EDuration="15.717565438s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.793212383 +0000 UTC m=+957.431479146" lastFinishedPulling="2026-03-13 12:04:14.789025406 +0000 UTC m=+970.427292169" observedRunningTime="2026-03-13 12:04:15.703244954 +0000 UTC m=+971.341511717" watchObservedRunningTime="2026-03-13 12:04:15.717565438 +0000 UTC m=+971.355832201" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.718737 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" podStartSLOduration=3.775047544 podStartE2EDuration="16.718730274s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.792546862 +0000 UTC m=+957.430813625" lastFinishedPulling="2026-03-13 12:04:14.736229592 +0000 UTC m=+970.374496355" observedRunningTime="2026-03-13 12:04:15.681152834 +0000 UTC m=+971.319419597" watchObservedRunningTime="2026-03-13 12:04:15.718730274 +0000 UTC m=+971.356997037" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.735094 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" podStartSLOduration=3.269176655 podStartE2EDuration="15.735072752s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.867164165 +0000 UTC m=+957.505430938" lastFinishedPulling="2026-03-13 12:04:14.333060272 +0000 UTC 
m=+969.971327035" observedRunningTime="2026-03-13 12:04:15.731087966 +0000 UTC m=+971.369354719" watchObservedRunningTime="2026-03-13 12:04:15.735072752 +0000 UTC m=+971.373339515" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.781975 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xkk4z" podStartSLOduration=3.192470765 podStartE2EDuration="15.781951367s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.211043477 +0000 UTC m=+957.849310240" lastFinishedPulling="2026-03-13 12:04:14.800524039 +0000 UTC m=+970.438790842" observedRunningTime="2026-03-13 12:04:15.777881918 +0000 UTC m=+971.416148681" watchObservedRunningTime="2026-03-13 12:04:15.781951367 +0000 UTC m=+971.420218130" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.826405 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" podStartSLOduration=3.8889076989999998 podStartE2EDuration="16.826383044s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.40828686 +0000 UTC m=+957.046553623" lastFinishedPulling="2026-03-13 12:04:14.345762205 +0000 UTC m=+969.984028968" observedRunningTime="2026-03-13 12:04:15.82307941 +0000 UTC m=+971.461346173" watchObservedRunningTime="2026-03-13 12:04:15.826383044 +0000 UTC m=+971.464649807" Mar 13 12:04:15 crc kubenswrapper[4837]: E0313 12:04:15.876887 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-conmon-8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbc8197_2f60_4876_8bab_ae450e22db4d.slice/crio-8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7.scope\": RecentStats: unable to find data in memory cache]" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.896795 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" podStartSLOduration=3.838338234 podStartE2EDuration="15.896768614s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.881774338 +0000 UTC m=+957.520041101" lastFinishedPulling="2026-03-13 12:04:13.940204718 +0000 UTC m=+969.578471481" observedRunningTime="2026-03-13 12:04:15.881594093 +0000 UTC m=+971.519860866" watchObservedRunningTime="2026-03-13 12:04:15.896768614 +0000 UTC m=+971.535035387" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.943740 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" podStartSLOduration=2.946334107 podStartE2EDuration="15.943721881s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.867180495 +0000 UTC m=+957.505447258" lastFinishedPulling="2026-03-13 12:04:14.864568269 +0000 UTC m=+970.502835032" observedRunningTime="2026-03-13 12:04:15.941454359 +0000 UTC m=+971.579721122" watchObservedRunningTime="2026-03-13 12:04:15.943721881 +0000 UTC m=+971.581988644" Mar 13 12:04:15 crc kubenswrapper[4837]: I0313 12:04:15.977043 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" podStartSLOduration=3.4862075900000002 podStartE2EDuration="15.977020126s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.854971259 +0000 UTC m=+957.493238022" lastFinishedPulling="2026-03-13 12:04:14.345783795 +0000 UTC m=+969.984050558" observedRunningTime="2026-03-13 12:04:15.974196096 +0000 UTC m=+971.612462849" watchObservedRunningTime="2026-03-13 12:04:15.977020126 +0000 UTC m=+971.615286889" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.039932 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" podStartSLOduration=3.168137805 podStartE2EDuration="16.039909898s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:01.818499214 +0000 UTC m=+957.456765987" lastFinishedPulling="2026-03-13 12:04:14.690271317 +0000 UTC m=+970.328538080" observedRunningTime="2026-03-13 12:04:16.004495106 +0000 UTC m=+971.642761859" watchObservedRunningTime="2026-03-13 12:04:16.039909898 +0000 UTC m=+971.678176651" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.044626 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" podStartSLOduration=3.914331675 podStartE2EDuration="17.044615677s" podCreationTimestamp="2026-03-13 12:03:59 +0000 UTC" firstStartedPulling="2026-03-13 12:04:00.809977828 +0000 UTC m=+956.448244591" lastFinishedPulling="2026-03-13 12:04:13.94026183 +0000 UTC m=+969.578528593" observedRunningTime="2026-03-13 12:04:16.037879113 +0000 UTC m=+971.676145876" watchObservedRunningTime="2026-03-13 12:04:16.044615677 +0000 UTC m=+971.682882440" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.129740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.129924 4837 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.129999 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert podName:c19c3466-ab50-4be3-8299-d7b8b3d263df nodeName:}" failed. No retries permitted until 2026-03-13 12:04:32.129981072 +0000 UTC m=+987.768247825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert") pod "infra-operator-controller-manager-5995f4446f-fhlk9" (UID: "c19c3466-ab50-4be3-8299-d7b8b3d263df") : secret "infra-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.332033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.332216 4837 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: E0313 12:04:16.332293 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert podName:7b38159c-e030-4734-963d-dfc38d29c75c nodeName:}" failed. No retries permitted until 2026-03-13 12:04:32.332275929 +0000 UTC m=+987.970542692 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" (UID: "7b38159c-e030-4734-963d-dfc38d29c75c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.711721 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7" exitCode=0 Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.711793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7"} Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.713920 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" exitCode=0 Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.714756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.716429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.744015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.744310 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.758427 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-webhook-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.760678 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf3fa29-f441-43df-9fbe-409d9d8ad871-metrics-certs\") pod \"openstack-operator-controller-manager-55876d85bb-96mp7\" (UID: \"eaf3fa29-f441-43df-9fbe-409d9d8ad871\") " pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.779112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x6jmh" Mar 13 12:04:16 crc kubenswrapper[4837]: I0313 12:04:16.786257 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:17 crc kubenswrapper[4837]: I0313 12:04:17.113076 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7"] Mar 13 12:04:17 crc kubenswrapper[4837]: W0313 12:04:17.124512 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf3fa29_f441_43df_9fbe_409d9d8ad871.slice/crio-532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23 WatchSource:0}: Error finding container 532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23: Status 404 returned error can't find the container with id 532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23 Mar 13 12:04:17 crc kubenswrapper[4837]: I0313 12:04:17.720762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" event={"ID":"eaf3fa29-f441-43df-9fbe-409d9d8ad871","Type":"ContainerStarted","Data":"532e6f022fc3bbb9047521c5b235eeb2ea9c2b1b404f498d41b1b67802ac6a23"} Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.287138 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-jvdqq" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.309091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-kbn8z" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.320156 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-b7cdx" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.341174 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mrgb9" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.355788 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-ss4rm" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.439709 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-bvmr7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.593063 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-9zvxf" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.612988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc2x6" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.703482 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-twrg7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.724757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7nm95" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.751053 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" event={"ID":"eaf3fa29-f441-43df-9fbe-409d9d8ad871","Type":"ContainerStarted","Data":"1c79a7f053c259d44cf3c176d2e5da8d9e16078db1e03ad08933021339b6e95c"} Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.751451 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.771300 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-6ht9l" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.787980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" podStartSLOduration=20.787954556 podStartE2EDuration="20.787954556s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:04:20.78146066 +0000 UTC m=+976.419727423" watchObservedRunningTime="2026-03-13 12:04:20.787954556 +0000 UTC m=+976.426221389" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.801223 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-shrx7" Mar 13 12:04:20 crc kubenswrapper[4837]: I0313 12:04:20.838198 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-7f7zd" Mar 13 12:04:21 crc kubenswrapper[4837]: I0313 12:04:21.951906 4837 scope.go:117] "RemoveContainer" containerID="f165f764ee51b6b29672c3c9a0ac54376301b2d6f3ce983abfa09b63813909b9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.788236 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" 
event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.790527 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerStarted","Data":"ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.792014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" event={"ID":"35a21ab1-95b5-446a-ae10-d004e5aa2995","Type":"ContainerStarted","Data":"76522bc68aa812f69238e3872a3412d8bddf1196af88f9e904e2ed539ac6e32a"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.792220 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.793264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" event={"ID":"5ef20b1d-5c03-4993-b635-b031ddcab3bf","Type":"ContainerStarted","Data":"e52f5bccfc9477dc64bf778fb2266d7bff7a50af8c2d85588e0a62e951ce03ac"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.793419 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.794576 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" event={"ID":"5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d","Type":"ContainerStarted","Data":"9f58e7a6ae2a49785e328ced3d15e0ed6a3770c9bd85ea7ee06f1d062a36a7d5"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.794798 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.795959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" event={"ID":"55649f1c-678e-4e03-be55-7c4435446199","Type":"ContainerStarted","Data":"057230b019cd1eb295878f6b1772bae9b97d1696a39934a128cc60b0b7f78404"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.796167 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.797219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" event={"ID":"cb20db22-bd0e-4897-8ed6-a6a80a91ffff","Type":"ContainerStarted","Data":"abd2c01a67167137e03c652ba82b4e7c80e5d272da396063ea4aee9b01b3b72a"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.797407 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.798411 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" 
event={"ID":"fe107e39-b5ec-473d-8851-b57775dadafc","Type":"ContainerStarted","Data":"9fad83d12a16d9e6168924f862324dea13d9e775d616db279f26dc4bd7d2e686"} Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.798574 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.825973 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" podStartSLOduration=2.696709791 podStartE2EDuration="24.825958062s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.242125402 +0000 UTC m=+957.880392165" lastFinishedPulling="2026-03-13 12:04:24.371373673 +0000 UTC m=+980.009640436" observedRunningTime="2026-03-13 12:04:24.824706793 +0000 UTC m=+980.462973566" watchObservedRunningTime="2026-03-13 12:04:24.825958062 +0000 UTC m=+980.464224825" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.850450 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" podStartSLOduration=4.150153501 podStartE2EDuration="24.850433918s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.217956547 +0000 UTC m=+957.856223310" lastFinishedPulling="2026-03-13 12:04:22.918236954 +0000 UTC m=+978.556503727" observedRunningTime="2026-03-13 12:04:24.845366157 +0000 UTC m=+980.483632920" watchObservedRunningTime="2026-03-13 12:04:24.850433918 +0000 UTC m=+980.488700671" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.867078 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" podStartSLOduration=2.769493108 podStartE2EDuration="24.867062775s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.231692422 +0000 UTC m=+957.869959185" lastFinishedPulling="2026-03-13 12:04:24.329262079 +0000 UTC m=+979.967528852" observedRunningTime="2026-03-13 12:04:24.864164543 +0000 UTC m=+980.502431306" watchObservedRunningTime="2026-03-13 12:04:24.867062775 +0000 UTC m=+980.505329538" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.887209 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" podStartSLOduration=2.800421825 podStartE2EDuration="24.887190201s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.24239122 +0000 UTC m=+957.880657983" lastFinishedPulling="2026-03-13 12:04:24.329159596 +0000 UTC m=+979.967426359" observedRunningTime="2026-03-13 12:04:24.886845751 +0000 UTC m=+980.525112514" watchObservedRunningTime="2026-03-13 12:04:24.887190201 +0000 UTC m=+980.525456964" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.911464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" podStartSLOduration=2.810509696 podStartE2EDuration="24.91142769s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.231483855 +0000 UTC m=+957.869750618" lastFinishedPulling="2026-03-13 12:04:24.332401859 +0000 UTC m=+979.970668612" observedRunningTime="2026-03-13 12:04:24.909128607 +0000 UTC 
m=+980.547395380" watchObservedRunningTime="2026-03-13 12:04:24.91142769 +0000 UTC m=+980.549694453" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.977425 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" podStartSLOduration=3.401998082 podStartE2EDuration="24.977404219s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:02.228936744 +0000 UTC m=+957.867203507" lastFinishedPulling="2026-03-13 12:04:23.804342881 +0000 UTC m=+979.442609644" observedRunningTime="2026-03-13 12:04:24.943128703 +0000 UTC m=+980.581395466" watchObservedRunningTime="2026-03-13 12:04:24.977404219 +0000 UTC m=+980.615670982" Mar 13 12:04:24 crc kubenswrapper[4837]: I0313 12:04:24.979012 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6qtb" podStartSLOduration=3.5822019210000002 podStartE2EDuration="22.97899888s" podCreationTimestamp="2026-03-13 12:04:02 +0000 UTC" firstStartedPulling="2026-03-13 12:04:04.924739956 +0000 UTC m=+960.563006719" lastFinishedPulling="2026-03-13 12:04:24.321536915 +0000 UTC m=+979.959803678" observedRunningTime="2026-03-13 12:04:24.976493451 +0000 UTC m=+980.614760234" watchObservedRunningTime="2026-03-13 12:04:24.97899888 +0000 UTC m=+980.617265643" Mar 13 12:04:25 crc kubenswrapper[4837]: I0313 12:04:25.807152 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb" exitCode=0 Mar 13 12:04:25 crc kubenswrapper[4837]: I0313 12:04:25.807855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"} Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.793257 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55876d85bb-96mp7" Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.818535 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerStarted","Data":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"} Mar 13 12:04:26 crc kubenswrapper[4837]: I0313 12:04:26.844483 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84c45" podStartSLOduration=8.183218857 podStartE2EDuration="18.844459049s" podCreationTimestamp="2026-03-13 12:04:08 +0000 UTC" firstStartedPulling="2026-03-13 12:04:15.549173884 +0000 UTC m=+971.187440647" lastFinishedPulling="2026-03-13 12:04:26.210414076 +0000 UTC m=+981.848680839" observedRunningTime="2026-03-13 12:04:26.838278024 +0000 UTC m=+982.476544787" watchObservedRunningTime="2026-03-13 12:04:26.844459049 +0000 UTC m=+982.482725812" Mar 13 12:04:29 crc kubenswrapper[4837]: I0313 12:04:29.346098 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:29 crc kubenswrapper[4837]: I0313 12:04:29.346160 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:29 crc 
kubenswrapper[4837]: I0313 12:04:29.394859 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.898538 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-nxwr9" Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.910369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-fwblp" Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.944071 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-cfv8z" Mar 13 12:04:30 crc kubenswrapper[4837]: I0313 12:04:30.978252 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-8lkmx" Mar 13 12:04:31 crc kubenswrapper[4837]: I0313 12:04:31.076988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dk4nr" Mar 13 12:04:31 crc kubenswrapper[4837]: I0313 12:04:31.114410 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hrcp9" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.180008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.187317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c19c3466-ab50-4be3-8299-d7b8b3d263df-cert\") pod \"infra-operator-controller-manager-5995f4446f-fhlk9\" (UID: \"c19c3466-ab50-4be3-8299-d7b8b3d263df\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.360566 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzbzz" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.369160 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.383474 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.387927 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b38159c-e030-4734-963d-dfc38d29c75c-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b77x9vc\" (UID: \"7b38159c-e030-4734-963d-dfc38d29c75c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.436708 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.436766 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.515994 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.641480 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qzn2w" Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.650335 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:32 crc kubenswrapper[4837]: W0313 12:04:32.829463 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19c3466_ab50_4be3_8299_d7b8b3d263df.slice/crio-ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a WatchSource:0}: Error finding container ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a: Status 404 returned error can't find the container with id ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.830745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9"] Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.868132 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc"] Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.868438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" event={"ID":"c19c3466-ab50-4be3-8299-d7b8b3d263df","Type":"ContainerStarted","Data":"ad6d1202460b82cfc1e3d7bfcdb8137ca76d8f4e67f2ebf0c7488eb495a97a5a"} Mar 13 12:04:32 crc kubenswrapper[4837]: W0313 12:04:32.874807 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b38159c_e030_4734_963d_dfc38d29c75c.slice/crio-6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77 WatchSource:0}: Error finding container 6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77: Status 404 returned error can't find the container with id 6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77 Mar 13 12:04:32 crc kubenswrapper[4837]: I0313 12:04:32.905480 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:33 crc kubenswrapper[4837]: I0313 12:04:33.316085 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:33 crc kubenswrapper[4837]: I0313 12:04:33.877380 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" event={"ID":"7b38159c-e030-4734-963d-dfc38d29c75c","Type":"ContainerStarted","Data":"6257d4c927203d267ef19e259a32affb128d05fca949bd662f5e1462b6e46a77"} Mar 13 12:04:34 crc kubenswrapper[4837]: I0313 12:04:34.883473 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6qtb" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" containerID="cri-o://ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df" gracePeriod=2 Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.483372 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.483594 4837 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.720491 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:35 crc kubenswrapper[4837]: E0313 12:04:35.721344 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.721371 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.721538 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" containerName="oc" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.722700 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.727344 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739188 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.739283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840356 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840400 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.840421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: 
\"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.841148 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.841246 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.864483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"redhat-operators-95c7q\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.896308 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" event={"ID":"7b38159c-e030-4734-963d-dfc38d29c75c","Type":"ContainerStarted","Data":"2ab86d7ebd8d40fa48f3ce7fd722f9b522ea7d214ba983eb06850bd968519f0d"} Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.897314 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.907189 4837 generic.go:334] "Generic (PLEG): container finished" podID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerID="ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df" exitCode=0 Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.907251 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df"} Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.908857 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" event={"ID":"c19c3466-ab50-4be3-8299-d7b8b3d263df","Type":"ContainerStarted","Data":"9a5e7c35318040a70b980cd86ad93a24d7c7c060feeff3e0ab9006cae0e2922d"} Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.909450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.940524 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" podStartSLOduration=33.644429964 podStartE2EDuration="35.940503404s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:32.876807909 +0000 UTC m=+988.515074672" lastFinishedPulling="2026-03-13 
12:04:35.172881349 +0000 UTC m=+990.811148112" observedRunningTime="2026-03-13 12:04:35.935279318 +0000 UTC m=+991.573546081" watchObservedRunningTime="2026-03-13 12:04:35.940503404 +0000 UTC m=+991.578770167" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.953427 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:35 crc kubenswrapper[4837]: I0313 12:04:35.965931 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" podStartSLOduration=33.62729279 podStartE2EDuration="35.965557567s" podCreationTimestamp="2026-03-13 12:04:00 +0000 UTC" firstStartedPulling="2026-03-13 12:04:32.839156166 +0000 UTC m=+988.477422969" lastFinishedPulling="2026-03-13 12:04:35.177420983 +0000 UTC m=+990.815687746" observedRunningTime="2026-03-13 12:04:35.955896552 +0000 UTC m=+991.594163315" watchObservedRunningTime="2026-03-13 12:04:35.965557567 +0000 UTC m=+991.603824340" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.038985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042526 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.042681 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") pod \"ebbc8197-2f60-4876-8bab-ae450e22db4d\" (UID: \"ebbc8197-2f60-4876-8bab-ae450e22db4d\") " Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.043968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities" (OuterVolumeSpecName: "utilities") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.049228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z" (OuterVolumeSpecName: "kube-api-access-sp64z") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "kube-api-access-sp64z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.109137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebbc8197-2f60-4876-8bab-ae450e22db4d" (UID: "ebbc8197-2f60-4876-8bab-ae450e22db4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143716 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp64z\" (UniqueName: \"kubernetes.io/projected/ebbc8197-2f60-4876-8bab-ae450e22db4d-kube-api-access-sp64z\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143746 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.143757 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebbc8197-2f60-4876-8bab-ae450e22db4d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.505954 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:36 crc kubenswrapper[4837]: W0313 12:04:36.509369 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecbc61c9_ab9c_485e_9112_bb8704851de8.slice/crio-d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942 WatchSource:0}: Error finding container d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942: Status 404 returned error can't find the container with id d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942 Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916499 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" exitCode=0 Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"} Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.916609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942"} Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6qtb" event={"ID":"ebbc8197-2f60-4876-8bab-ae450e22db4d","Type":"ContainerDied","Data":"9a83d8f4cff27c5d1da1eb8b95de8aad51c44fbb135c4b97c55466208565344c"} Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919673 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6qtb" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.919702 4837 scope.go:117] "RemoveContainer" containerID="ae0fc7875b5423aa4eeca51206519575c5e9ed80070b4ef32750000a14fbd0df" Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.961173 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:36 crc kubenswrapper[4837]: I0313 12:04:36.973492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6qtb"] Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.058519 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" path="/var/lib/kubelet/pods/ebbc8197-2f60-4876-8bab-ae450e22db4d/volumes" Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.663893 4837 scope.go:117] "RemoveContainer" containerID="8bba29e2f9fe402d7901b43545abaa3e8580bab8bbce4d30cd8eeaffafb977c7" Mar 13 12:04:37 crc kubenswrapper[4837]: I0313 12:04:37.686547 4837 scope.go:117] "RemoveContainer" containerID="0ddcb245a681a303f4445dab08f3327e3df698349bd5573d79829fcc09b9c9ef" Mar 13 12:04:39 crc kubenswrapper[4837]: I0313 12:04:39.394596 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:41 crc kubenswrapper[4837]: I0313 12:04:41.708941 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:41 crc kubenswrapper[4837]: I0313 12:04:41.709606 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84c45" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" containerID="cri-o://e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" gracePeriod=2 Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.378323 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-fhlk9" Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.658633 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b77x9vc" Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.867717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.963925 4837 generic.go:334] "Generic (PLEG): container finished" podID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" exitCode=0 Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.963997 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84c45" Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"} Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84c45" event={"ID":"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47","Type":"ContainerDied","Data":"975dad2eacabc4e891be9a4559895099caf82984d20099b8ca5021f0401addb6"} Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.964156 4837 scope.go:117] "RemoveContainer" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.968756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"} Mar 13 12:04:42 crc kubenswrapper[4837]: I0313 12:04:42.991146 4837 scope.go:117] "RemoveContainer" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.025358 4837 scope.go:117] "RemoveContainer" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038538 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038598 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.038699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") pod \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\" (UID: \"5b5b628a-9d8f-4ce7-b023-adbb2b00ff47\") " Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.039970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities" (OuterVolumeSpecName: "utilities") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.046302 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9" (OuterVolumeSpecName: "kube-api-access-8kfl9") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "kube-api-access-8kfl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.096607 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" (UID: "5b5b628a-9d8f-4ce7-b023-adbb2b00ff47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117157 4837 scope.go:117] "RemoveContainer" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.117614 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": container with ID starting with e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2 not found: ID does not exist" containerID="e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117714 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2"} err="failed to get container status \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": rpc error: code = NotFound desc = could not find container \"e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2\": container with ID starting with e4431103593ed4766f2c10d7f2c36127f9debf5d279d8749159e768de7bb7cc2 not found: ID does not exist" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.117739 4837 scope.go:117] "RemoveContainer" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb" Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.118092 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": container with ID starting with 400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb not found: ID does not exist" containerID="400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.118143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb"} err="failed to get container status \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": rpc error: code = NotFound desc = could not find container \"400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb\": container with ID starting with 400082a25f5ff89be657587e2ae6b6c9469f186282f23d0b831a5031bd2d1dfb not found: ID does not exist" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.118160 4837 scope.go:117] "RemoveContainer" containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" Mar 13 12:04:43 crc kubenswrapper[4837]: E0313 12:04:43.119669 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": container with ID starting with d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d not found: ID does not exist" 
containerID="d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.119699 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d"} err="failed to get container status \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": rpc error: code = NotFound desc = could not find container \"d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d\": container with ID starting with d879a84e3b9f99d7a40acbf499a69feddf5c1973d117fb58100dc525ee864d7d not found: ID does not exist" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141049 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141121 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kfl9\" (UniqueName: \"kubernetes.io/projected/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-kube-api-access-8kfl9\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.141139 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.298254 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.306859 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84c45"] Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.983914 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" exitCode=0 Mar 13 12:04:43 crc kubenswrapper[4837]: I0313 12:04:43.984018 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"} Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116067 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116777 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116796 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116808 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116814 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116834 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" 
containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116843 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116854 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116862 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-utilities" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116877 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116884 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: E0313 12:04:44.116899 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.116906 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="extract-content" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.117093 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbc8197-2f60-4876-8bab-ae450e22db4d" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.117109 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" containerName="registry-server" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.118236 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.127970 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256469 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.256522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357680 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.357786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.358220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.358466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.378902 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"redhat-marketplace-2crl9\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.441172 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.668205 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.993432 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9" exitCode=0 Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.993550 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"} Mar 13 12:04:44 crc kubenswrapper[4837]: I0313 12:04:44.994115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerStarted","Data":"494be8647d8bf49d20735dd5e68f3201a4a95fac989aa78a899f4c7f78bb7851"} Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.000695 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerStarted","Data":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.034790 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95c7q" podStartSLOduration=2.421715574 podStartE2EDuration="10.034762222s" podCreationTimestamp="2026-03-13 12:04:35 +0000 UTC" firstStartedPulling="2026-03-13 12:04:36.918736101 +0000 UTC m=+992.557002864" lastFinishedPulling="2026-03-13 12:04:44.531782749 +0000 UTC m=+1000.170049512" observedRunningTime="2026-03-13 12:04:45.03155 +0000 UTC m=+1000.669816773" watchObservedRunningTime="2026-03-13 12:04:45.034762222 +0000 UTC m=+1000.673028985" Mar 13 12:04:45 crc kubenswrapper[4837]: I0313 12:04:45.056979 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5b628a-9d8f-4ce7-b023-adbb2b00ff47" path="/var/lib/kubelet/pods/5b5b628a-9d8f-4ce7-b023-adbb2b00ff47/volumes" Mar 13 12:04:46 crc kubenswrapper[4837]: I0313 12:04:46.039138 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:46 crc kubenswrapper[4837]: I0313 12:04:46.039337 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.014627 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b" exitCode=0 Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.014735 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" 
event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"} Mar 13 12:04:47 crc kubenswrapper[4837]: I0313 12:04:47.081448 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95c7q" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" probeResult="failure" output=< Mar 13 12:04:47 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:04:47 crc kubenswrapper[4837]: > Mar 13 12:04:48 crc kubenswrapper[4837]: I0313 12:04:48.037116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerStarted","Data":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} Mar 13 12:04:48 crc kubenswrapper[4837]: I0313 12:04:48.062167 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2crl9" podStartSLOduration=1.488537196 podStartE2EDuration="4.062133816s" podCreationTimestamp="2026-03-13 12:04:44 +0000 UTC" firstStartedPulling="2026-03-13 12:04:44.995806278 +0000 UTC m=+1000.634073041" lastFinishedPulling="2026-03-13 12:04:47.569402888 +0000 UTC m=+1003.207669661" observedRunningTime="2026-03-13 12:04:48.056106686 +0000 UTC m=+1003.694373449" watchObservedRunningTime="2026-03-13 12:04:48.062133816 +0000 UTC m=+1003.700400579" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.443165 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.444964 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:54 crc kubenswrapper[4837]: I0313 12:04:54.486574 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:55 crc kubenswrapper[4837]: I0313 12:04:55.121541 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:55 crc kubenswrapper[4837]: I0313 12:04:55.163370 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:56 crc kubenswrapper[4837]: I0313 12:04:56.076381 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:56 crc kubenswrapper[4837]: I0313 12:04:56.117296 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.096065 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2crl9" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server" containerID="cri-o://f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" gracePeriod=2 Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.113744 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.113977 4837 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-95c7q" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" containerID="cri-o://c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" gracePeriod=2 Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.589050 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.608929 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755505 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755862 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755909 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755929 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") pod \"ecbc61c9-ab9c-485e-9112-bb8704851de8\" (UID: \"ecbc61c9-ab9c-485e-9112-bb8704851de8\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.755970 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") pod \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\" (UID: \"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47\") " Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.756894 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities" (OuterVolumeSpecName: "utilities") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.757080 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities" (OuterVolumeSpecName: "utilities") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.761922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c" (OuterVolumeSpecName: "kube-api-access-vvb2c") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). InnerVolumeSpecName "kube-api-access-vvb2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.774944 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk" (OuterVolumeSpecName: "kube-api-access-l7hbk") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "kube-api-access-l7hbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.789258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" (UID: "7ebf68eb-aa74-4cb1-8c39-11b467b1cf47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857463 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857502 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvb2c\" (UniqueName: \"kubernetes.io/projected/ecbc61c9-ab9c-485e-9112-bb8704851de8-kube-api-access-vvb2c\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857512 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857521 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7hbk\" (UniqueName: \"kubernetes.io/projected/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-kube-api-access-l7hbk\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.857530 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.896945 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecbc61c9-ab9c-485e-9112-bb8704851de8" (UID: "ecbc61c9-ab9c-485e-9112-bb8704851de8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:04:57 crc kubenswrapper[4837]: I0313 12:04:57.959004 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecbc61c9-ab9c-485e-9112-bb8704851de8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104201 4837 generic.go:334] "Generic (PLEG): container finished" podID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" exitCode=0 Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95c7q" event={"ID":"ecbc61c9-ab9c-485e-9112-bb8704851de8","Type":"ContainerDied","Data":"d87a304801c0e9f7051845724164f67c6d9d6a0ecd1e18c769cc77332da22942"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.104343 4837 scope.go:117] "RemoveContainer" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.105354 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95c7q" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.106817 4837 generic.go:334] "Generic (PLEG): container finished" podID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" exitCode=0 Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.106968 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2crl9" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.107734 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.107784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2crl9" event={"ID":"7ebf68eb-aa74-4cb1-8c39-11b467b1cf47","Type":"ContainerDied","Data":"494be8647d8bf49d20735dd5e68f3201a4a95fac989aa78a899f4c7f78bb7851"} Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.149255 4837 scope.go:117] "RemoveContainer" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.179359 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.179554 4837 scope.go:117] "RemoveContainer" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.187313 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95c7q"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.199166 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.209392 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2crl9"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.211871 4837 scope.go:117] "RemoveContainer" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.212410 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": container with ID starting with c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd not found: ID does not exist" containerID="c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.212442 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd"} err="failed to get container status \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": rpc error: code = NotFound desc = could not find container \"c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd\": container with ID starting with c21f95b043609dd72a88f8756cc2bc37d9f2feadcbd2719605057d08fa6289bd not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.212464 4837 scope.go:117] "RemoveContainer" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.214048 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": container with ID starting with 15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2 not found: ID does not 
exist" containerID="15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214082 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2"} err="failed to get container status \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": rpc error: code = NotFound desc = could not find container \"15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2\": container with ID starting with 15effd1f70e513b4a4e478354af67a4a98140eaf51a9874afc59ee39e91f07c2 not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214105 4837 scope.go:117] "RemoveContainer" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.214506 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": container with ID starting with 80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c not found: ID does not exist" containerID="80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214529 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c"} err="failed to get container status \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": rpc error: code = NotFound desc = could not find container \"80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c\": container with ID starting with 80e7a9f84e3e53924280e958be2d8ebd2909130bea780f65b73b56b9eeb2548c not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.214542 4837 scope.go:117] "RemoveContainer" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.239037 4837 scope.go:117] "RemoveContainer" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.256079 4837 scope.go:117] "RemoveContainer" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.294808 4837 scope.go:117] "RemoveContainer" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.295389 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": container with ID starting with f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8 not found: ID does not exist" containerID="f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.295497 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8"} err="failed to get container status \"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": rpc error: code = NotFound desc = could not find container 
\"f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8\": container with ID starting with f6edba34bfa2024c1573d537437cd2155fc803ec2e5f085333dfcafbabdc46f8 not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.295591 4837 scope.go:117] "RemoveContainer" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.296010 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": container with ID starting with 0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b not found: ID does not exist" containerID="0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296124 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b"} err="failed to get container status \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": rpc error: code = NotFound desc = could not find container \"0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b\": container with ID starting with 0538e711d2d8831b415f66821458a301532affb1497219de99e29dbcb0da491b not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296222 4837 scope.go:117] "RemoveContainer" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.296580 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": container with ID starting with de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9 not found: ID does not exist" containerID="de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.296705 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9"} err="failed to get container status \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": rpc error: code = NotFound desc = could not find container \"de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9\": container with ID starting with de4836aad6222fc6702dec9e38648e5ea3face7c15046e0ff57102279a2cabc9 not found: ID does not exist" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.723627 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724292 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724316 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724330 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-content" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724338 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-content" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724356 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-utilities" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724366 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="extract-utilities" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724383 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724403 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-content" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724411 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-content" Mar 13 12:04:58 crc kubenswrapper[4837]: E0313 12:04:58.724426 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-utilities" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724433 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="extract-utilities" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724609 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.724650 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" containerName="registry-server" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.725548 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.728159 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.728402 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.729998 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.735994 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.736119 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ng87d" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.787966 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.789412 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.792397 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.800179 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.871833 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.872432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973115 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973179 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973213 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 
12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973235 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.973265 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974510 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.974509 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.990557 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"dnsmasq-dns-78dd6ddcc-4pw9n\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:58 crc kubenswrapper[4837]: I0313 12:04:58.990577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"dnsmasq-dns-675f4bcbfc-29x4s\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.043571 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.056237 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ebf68eb-aa74-4cb1-8c39-11b467b1cf47" path="/var/lib/kubelet/pods/7ebf68eb-aa74-4cb1-8c39-11b467b1cf47/volumes" Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.057149 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecbc61c9-ab9c-485e-9112-bb8704851de8" path="/var/lib/kubelet/pods/ecbc61c9-ab9c-485e-9112-bb8704851de8/volumes" Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.120599 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.445905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:04:59 crc kubenswrapper[4837]: I0313 12:04:59.582101 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:00 crc kubenswrapper[4837]: I0313 12:05:00.137167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" event={"ID":"5b9562f6-0527-40b4-9b2e-f5b2f22aa272","Type":"ContainerStarted","Data":"da674bf6aef47158bcb9f95c5eb9d1a420c65f8f1989031a5ce339d17724e353"} Mar 13 12:05:00 crc kubenswrapper[4837]: I0313 12:05:00.138617 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" event={"ID":"63688ba3-e68c-4f88-a6e4-6c373b30f929","Type":"ContainerStarted","Data":"a94e60840947634f6bf1cb267e8fd71afc4b1db7581a3cc51184314c3d6b19e7"} Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.654070 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.688941 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.690337 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.700421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717164 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717286 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.717313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819079 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819135 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") 
pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.819161 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.820213 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.822058 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.848161 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"dnsmasq-dns-5ccc8479f9-7wbz7\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.939189 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.963137 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.966806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:01 crc kubenswrapper[4837]: I0313 12:05:01.982372 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.011982 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022622 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022730 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.022752 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.124427 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.124469 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.125842 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.127659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.128375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.149108 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"dnsmasq-dns-57d769cc4f-7vs6f\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.291342 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.822946 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.824358 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832713 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832750 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.832920 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mb2tp" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833025 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833121 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833217 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.833307 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.837219 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939261 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939686 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc 
kubenswrapper[4837]: I0313 12:05:02.939769 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939891 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.939974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:02 crc kubenswrapper[4837]: I0313 12:05:02.940153 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041506 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041548 
4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041573 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041604 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041627 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041706 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041730 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.041782 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042371 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.042754 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.043059 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.043278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.044078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.046860 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.046941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.053658 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.054439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.058909 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.070534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.117476 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.119933 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.122984 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123323 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123429 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.123719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.124203 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8bxdt" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.124788 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.129470 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.144682 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.158365 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243742 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243795 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243817 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243831 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243873 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243892 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243948 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.243964 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc 
kubenswrapper[4837]: I0313 12:05:03.243987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.244005 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345085 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345111 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345175 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345215 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345278 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345301 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.345317 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.346191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.347251 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.348497 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349348 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349455 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.349659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.351826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.354025 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.360598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.371449 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.379340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"rabbitmq-server-0\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " pod="openstack/rabbitmq-server-0" Mar 13 12:05:03 crc kubenswrapper[4837]: I0313 12:05:03.453791 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.417437 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.420150 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422576 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422805 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.422947 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nz226" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.426529 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.430170 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.433051 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563897 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563939 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.563997 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564341 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.564393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666366 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666399 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666480 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666525 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666554 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.666689 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667620 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667871 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.667891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.671591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.677317 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.694768 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.702248 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncdgz\" (UniqueName: \"kubernetes.io/projected/adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd-kube-api-access-ncdgz\") pod \"openstack-galera-0\" (UID: \"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd\") " pod="openstack/openstack-galera-0" Mar 13 12:05:04 crc kubenswrapper[4837]: I0313 12:05:04.745310 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484300 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484618 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.484736 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.485771 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.485835 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1" gracePeriod=600 Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.792323 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.802864 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.805340 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.813696 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.813915 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5bq6m" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.814100 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.814877 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886837 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.886980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887012 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887039 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4vp\" (UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.887285 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.913483 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.914612 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.918077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.918890 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.919064 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8shmp" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.952430 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.988992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4vp\" (UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989046 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989065 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989141 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.989478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.990019 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.990659 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994231 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/362e31d4-ea62-40ed-8426-982d47559472-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994390 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994441 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994705 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/362e31d4-ea62-40ed-8426-982d47559472-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.994905 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.995436 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:05 crc kubenswrapper[4837]: I0313 12:05:05.995561 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362e31d4-ea62-40ed-8426-982d47559472-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.009966 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4vp\" (UniqueName: \"kubernetes.io/projected/362e31d4-ea62-40ed-8426-982d47559472-kube-api-access-dq4vp\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.022623 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"362e31d4-ea62-40ed-8426-982d47559472\") " pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.096965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097037 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " 
pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097067 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097099 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.097140 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.098952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-config-data\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.099119 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae39431b-5fa4-4a09-b76f-44b4d256c129-kolla-config\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.102404 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.109934 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae39431b-5fa4-4a09-b76f-44b4d256c129-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.120827 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvsbl\" (UniqueName: \"kubernetes.io/projected/ae39431b-5fa4-4a09-b76f-44b4d256c129-kube-api-access-lvsbl\") pod \"memcached-0\" (UID: \"ae39431b-5fa4-4a09-b76f-44b4d256c129\") " pod="openstack/memcached-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184333 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1" exitCode=0 Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184387 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1"} Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.184427 4837 scope.go:117] "RemoveContainer" 
containerID="86010f8ae6e03e22840b0db405e4816a52e1a80af0eff6188dd5d3d81e63937a" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.186663 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:06 crc kubenswrapper[4837]: I0313 12:05:06.238040 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.093229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.094787 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.096700 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ct7wn" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.106254 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.138589 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.239733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.268941 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"kube-state-metrics-0\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " pod="openstack/kube-state-metrics-0" Mar 13 12:05:08 crc kubenswrapper[4837]: I0313 12:05:08.422275 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:05:10 crc kubenswrapper[4837]: I0313 12:05:10.224961 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.607902 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.614623 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.618439 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620318 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hlqhf" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620600 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.620807 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.671466 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.673361 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.684618 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704049 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704174 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704223 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704250 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: 
\"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.704305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.805955 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.805999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806046 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806069 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806096 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806137 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806153 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806177 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806199 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.806274 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-log-ovn\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.807406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32dc51d9-5638-4530-91c8-5be8c13e60f3-var-run\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.808467 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dc51d9-5638-4530-91c8-5be8c13e60f3-scripts\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.815395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-ovn-controller-tls-certs\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.816342 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dc51d9-5638-4530-91c8-5be8c13e60f3-combined-ca-bundle\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.826354 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw65n\" (UniqueName: \"kubernetes.io/projected/32dc51d9-5638-4530-91c8-5be8c13e60f3-kube-api-access-dw65n\") pod \"ovn-controller-nbhpw\" (UID: \"32dc51d9-5638-4530-91c8-5be8c13e60f3\") " pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907862 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907898 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.907957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908008 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-etc-ovs\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 
12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-run\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-lib\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.908619 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/71e00962-6b2f-495c-8f34-52993f66cef9-var-log\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.910277 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71e00962-6b2f-495c-8f34-52993f66cef9-scripts\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.934353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxp7g\" (UniqueName: \"kubernetes.io/projected/71e00962-6b2f-495c-8f34-52993f66cef9-kube-api-access-mxp7g\") pod \"ovn-controller-ovs-ls998\" (UID: \"71e00962-6b2f-495c-8f34-52993f66cef9\") " pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:11 crc kubenswrapper[4837]: I0313 12:05:11.934825 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:12 crc kubenswrapper[4837]: I0313 12:05:12.004000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.692207 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.693973 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rlhxr" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696207 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696222 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.696514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.703403 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.707006 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754369 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754459 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754531 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.754600 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856486 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856540 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856612 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856807 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.856843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 
12:05:14.857300 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.861345 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.862193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-config\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.863084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38d61ffe-3c44-4657-bc91-d849f766a3e1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.867908 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.869972 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.870736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d61ffe-3c44-4657-bc91-d849f766a3e1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.880397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxmh\" (UniqueName: \"kubernetes.io/projected/38d61ffe-3c44-4657-bc91-d849f766a3e1-kube-api-access-fbxmh\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.882515 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.884353 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.888243 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.888824 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.889123 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nmsn4" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.889558 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"38d61ffe-3c44-4657-bc91-d849f766a3e1\") " pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.892714 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.893036 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.957807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.957874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.957910 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958343 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 
crc kubenswrapper[4837]: I0313 12:05:14.958365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.958388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:14 crc kubenswrapper[4837]: I0313 12:05:14.962419 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.020284 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059360 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059412 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059511 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.059546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.060218 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.060514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.069200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-config\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.073412 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d10fcb0-4d45-45bf-a663-971b8ce74010-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.077855 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.079548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.098192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d10fcb0-4d45-45bf-a663-971b8ce74010-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.100775 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xpnw\" (UniqueName: \"kubernetes.io/projected/3d10fcb0-4d45-45bf-a663-971b8ce74010-kube-api-access-2xpnw\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.104278 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3d10fcb0-4d45-45bf-a663-971b8ce74010\") " pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.255581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:15 crc kubenswrapper[4837]: I0313 12:05:15.281792 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerStarted","Data":"cf45881769a320d80a0b475b78878572d21663b4dd8fafe7ae1c9681a95c4a07"} Mar 13 12:05:15 crc kubenswrapper[4837]: W0313 12:05:15.764494 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b01be4_73b6_48eb_a06d_4fb38863d982.slice/crio-c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7 WatchSource:0}: Error finding container c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7: Status 404 returned error can't find the container with id c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7 Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.792618 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.792853 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wlx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-675f4bcbfc-29x4s_openstack(63688ba3-e68c-4f88-a6e4-6c373b30f929): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.794151 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" podUID="63688ba3-e68c-4f88-a6e4-6c373b30f929" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.841303 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.841499 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6rdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4pw9n_openstack(5b9562f6-0527-40b4-9b2e-f5b2f22aa272): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:05:15 crc kubenswrapper[4837]: E0313 12:05:15.842891 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" 
podUID="5b9562f6-0527-40b4-9b2e-f5b2f22aa272" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.304131 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7"} Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.311555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"} Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.457724 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.476518 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: W0313 12:05:16.484584 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13254c8b_516c_435e_9db2_a8d518434f29.slice/crio-5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c WatchSource:0}: Error finding container 5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c: Status 404 returned error can't find the container with id 5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.885484 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.892779 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.924443 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.932482 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.956524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 12:05:16 crc kubenswrapper[4837]: I0313 12:05:16.980798 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023925 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") pod \"63688ba3-e68c-4f88-a6e4-6c373b30f929\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.023995 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024019 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") pod \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\" (UID: \"5b9562f6-0527-40b4-9b2e-f5b2f22aa272\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024034 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") pod \"63688ba3-e68c-4f88-a6e4-6c373b30f929\" (UID: \"63688ba3-e68c-4f88-a6e4-6c373b30f929\") " Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config" (OuterVolumeSpecName: "config") pod "63688ba3-e68c-4f88-a6e4-6c373b30f929" (UID: "63688ba3-e68c-4f88-a6e4-6c373b30f929"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.024879 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config" (OuterVolumeSpecName: "config") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.026152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.035983 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2" (OuterVolumeSpecName: "kube-api-access-6wlx2") pod "63688ba3-e68c-4f88-a6e4-6c373b30f929" (UID: "63688ba3-e68c-4f88-a6e4-6c373b30f929"). InnerVolumeSpecName "kube-api-access-6wlx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.047921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw" (OuterVolumeSpecName: "kube-api-access-d6rdw") pod "5b9562f6-0527-40b4-9b2e-f5b2f22aa272" (UID: "5b9562f6-0527-40b4-9b2e-f5b2f22aa272"). InnerVolumeSpecName "kube-api-access-d6rdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.112229 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ls998"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.118884 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125628 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wlx2\" (UniqueName: \"kubernetes.io/projected/63688ba3-e68c-4f88-a6e4-6c373b30f929-kube-api-access-6wlx2\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125690 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6rdw\" (UniqueName: \"kubernetes.io/projected/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-kube-api-access-d6rdw\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125705 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125717 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63688ba3-e68c-4f88-a6e4-6c373b30f929-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.125728 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b9562f6-0527-40b4-9b2e-f5b2f22aa272-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.141939 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb9ab64_aa4b_45f4_8738_0ed74c3ed2bd.slice/crio-f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a WatchSource:0}: Error finding container f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a: Status 404 returned error can't find the container with id f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.145201 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae5deee0_59c4_4fa7_8d8c_e12b516885dc.slice/crio-dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51 WatchSource:0}: Error finding container 
dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51: Status 404 returned error can't find the container with id dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51 Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.150723 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae39431b_5fa4_4a09_b76f_44b4d256c129.slice/crio-eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23 WatchSource:0}: Error finding container eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23: Status 404 returned error can't find the container with id eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23 Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.155212 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32dc51d9_5638_4530_91c8_5be8c13e60f3.slice/crio-0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f WatchSource:0}: Error finding container 0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f: Status 404 returned error can't find the container with id 0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f Mar 13 12:05:17 crc kubenswrapper[4837]: W0313 12:05:17.155729 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e00962_6b2f_495c_8f34_52993f66cef9.slice/crio-5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3 WatchSource:0}: Error finding container 5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3: Status 404 returned error can't find the container with id 5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3 Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.319887 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae39431b-5fa4-4a09-b76f-44b4d256c129","Type":"ContainerStarted","Data":"eb9226cf039456e6835939bdf8a4b2eeafcc714a35f2dde06005ef4ba14f3c23"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.321488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.323482 4837 generic.go:334] "Generic (PLEG): container finished" podID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea" exitCode=0 Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.323541 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.325368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" event={"ID":"5b9562f6-0527-40b4-9b2e-f5b2f22aa272","Type":"ContainerDied","Data":"da674bf6aef47158bcb9f95c5eb9d1a420c65f8f1989031a5ce339d17724e353"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.325666 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4pw9n" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.329797 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"ddc538520d8e600b4e67cdf449278e1a15b98855f0bcdb3070e43bd4632a4dc3"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.339338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"f98462c5662e516a739e1682b280767c7d46d013e84a15caaf3ddb8663b73d9a"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.368235 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerStarted","Data":"dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.372807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" event={"ID":"63688ba3-e68c-4f88-a6e4-6c373b30f929","Type":"ContainerDied","Data":"a94e60840947634f6bf1cb267e8fd71afc4b1db7581a3cc51184314c3d6b19e7"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.372955 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-29x4s" Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.381337 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"5b6e7b77ac7881a09bdbef9dacc95d17e67aaa52c4bc3122591a1f9bfca019c3"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.420990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerStarted","Data":"7a22f32b80bf3ec02fab7028c9c981153ef89481c11b18583b8c1e3f0c67df24"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.423376 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw" event={"ID":"32dc51d9-5638-4530-91c8-5be8c13e60f3","Type":"ContainerStarted","Data":"0789b60c660687f217e3da05e0adc970132c74550fab068133b05224809ee33f"} Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.431020 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.454272 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4pw9n"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.475132 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:05:17 crc kubenswrapper[4837]: I0313 12:05:17.483492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-29x4s"] Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.093584 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.231017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 12:05:18 crc kubenswrapper[4837]: W0313 12:05:18.360144 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d10fcb0_4d45_45bf_a663_971b8ce74010.slice/crio-8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6 WatchSource:0}: Error finding container 8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6: Status 404 returned error can't find the container with id 8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6 Mar 13 12:05:18 crc kubenswrapper[4837]: W0313 12:05:18.364768 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38d61ffe_3c44_4657_bc91_d849f766a3e1.slice/crio-b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4 WatchSource:0}: Error finding container b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4: Status 404 returned error can't find the container with id b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4 Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.433555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"8195b06a2da393e3a7948f6a45bf28cc58d86c3113ec50ede9d3d585d004a6f6"} Mar 13 12:05:18 crc kubenswrapper[4837]: I0313 12:05:18.436815 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"b7ce67bc082f03585d388dd5a305fcdc2954f5f487cbe3ce865785cd6c8555a4"} Mar 13 12:05:19 crc kubenswrapper[4837]: I0313 12:05:19.060030 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9562f6-0527-40b4-9b2e-f5b2f22aa272" path="/var/lib/kubelet/pods/5b9562f6-0527-40b4-9b2e-f5b2f22aa272/volumes" Mar 13 12:05:19 crc kubenswrapper[4837]: I0313 12:05:19.060567 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63688ba3-e68c-4f88-a6e4-6c373b30f929" path="/var/lib/kubelet/pods/63688ba3-e68c-4f88-a6e4-6c373b30f929/volumes" Mar 13 12:05:25 crc kubenswrapper[4837]: I0313 12:05:25.502228 4837 generic.go:334] "Generic (PLEG): container finished" podID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" exitCode=0 Mar 13 12:05:25 crc kubenswrapper[4837]: I0313 12:05:25.502432 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.511675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"9c66450db75ad1df997cb59ba25629075bbdda7b2d722c10e976e13d82921a53"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.513181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.515530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" 
event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerStarted","Data":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.515684 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.517243 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e"} Mar 13 12:05:26 crc kubenswrapper[4837]: I0313 12:05:26.558459 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" podStartSLOduration=24.063680955 podStartE2EDuration="25.558435223s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="2026-03-13 12:05:14.539071641 +0000 UTC m=+1030.177338404" lastFinishedPulling="2026-03-13 12:05:16.033825909 +0000 UTC m=+1031.672092672" observedRunningTime="2026-03-13 12:05:26.557310777 +0000 UTC m=+1042.195577540" watchObservedRunningTime="2026-03-13 12:05:26.558435223 +0000 UTC m=+1042.196701996" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.541257 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.545309 4837 generic.go:334] "Generic (PLEG): container finished" podID="71e00962-6b2f-495c-8f34-52993f66cef9" containerID="12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e" exitCode=0 Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.545409 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerDied","Data":"12a87353cee06d3c720268e4190ee48375a699885842e4a67424170d2dca396e"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.554338 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.563075 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerStarted","Data":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.563244 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.574516 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.580482 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"f6f3d45ae4a8f0eb2588879a86f07d1cae21417054813611a749565840b2152d"} Mar 13 
12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.593540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw" event={"ID":"32dc51d9-5638-4530-91c8-5be8c13e60f3","Type":"ContainerStarted","Data":"fcd880f044b42eae49579b77a948d25c9288848e485da833c17099adb9827f62"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.595137 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nbhpw" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.597220 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" podStartSLOduration=26.597180556 podStartE2EDuration="26.597180556s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:27.589830033 +0000 UTC m=+1043.228096796" watchObservedRunningTime="2026-03-13 12:05:27.597180556 +0000 UTC m=+1043.235447329" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.601486 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerStarted","Data":"07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.601978 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.610884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ae39431b-5fa4-4a09-b76f-44b4d256c129","Type":"ContainerStarted","Data":"57ca39395feb458a8f10d2261da889c29c26f61eed4082b3099aea11d6719d00"} Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.611942 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.674681 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.865654186 podStartE2EDuration="22.674626719s" podCreationTimestamp="2026-03-13 12:05:05 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.155047825 +0000 UTC m=+1032.793314588" lastFinishedPulling="2026-03-13 12:05:24.964020368 +0000 UTC m=+1040.602287121" observedRunningTime="2026-03-13 12:05:27.669680272 +0000 UTC m=+1043.307947035" watchObservedRunningTime="2026-03-13 12:05:27.674626719 +0000 UTC m=+1043.312893482" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.688870 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.419684323 podStartE2EDuration="19.688844119s" podCreationTimestamp="2026-03-13 12:05:08 +0000 UTC" firstStartedPulling="2026-03-13 12:05:16.471252385 +0000 UTC m=+1032.109519148" lastFinishedPulling="2026-03-13 12:05:25.740412181 +0000 UTC m=+1041.378678944" observedRunningTime="2026-03-13 12:05:27.686988021 +0000 UTC m=+1043.325254784" watchObservedRunningTime="2026-03-13 12:05:27.688844119 +0000 UTC m=+1043.327110882" Mar 13 12:05:27 crc kubenswrapper[4837]: I0313 12:05:27.722382 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nbhpw" podStartSLOduration=8.816655873 podStartE2EDuration="16.72234785s" podCreationTimestamp="2026-03-13 12:05:11 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.159817716 
+0000 UTC m=+1032.798084479" lastFinishedPulling="2026-03-13 12:05:25.065509693 +0000 UTC m=+1040.703776456" observedRunningTime="2026-03-13 12:05:27.714284025 +0000 UTC m=+1043.352550788" watchObservedRunningTime="2026-03-13 12:05:27.72234785 +0000 UTC m=+1043.360614613" Mar 13 12:05:28 crc kubenswrapper[4837]: I0313 12:05:28.628207 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"04a1193282a85494c2ab05d91c89fc7c4037180e86d5b306a180cc69d1011c14"} Mar 13 12:05:28 crc kubenswrapper[4837]: I0313 12:05:28.628827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ls998" event={"ID":"71e00962-6b2f-495c-8f34-52993f66cef9","Type":"ContainerStarted","Data":"8ad679f31ac7d1ea967741bb7fc2b12c07ed73d803e94bc78bc90f1333ccac41"} Mar 13 12:05:29 crc kubenswrapper[4837]: I0313 12:05:29.635421 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:29 crc kubenswrapper[4837]: I0313 12:05:29.635473 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.656727 4837 generic.go:334] "Generic (PLEG): container finished" podID="adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd" containerID="42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269" exitCode=0 Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.656831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerDied","Data":"42d9d96b2ed8394546063531c8e1be585dbcac59412f38099bec42e73ac4a269"} Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.662317 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"38d61ffe-3c44-4657-bc91-d849f766a3e1","Type":"ContainerStarted","Data":"79e9c120dab3e4846bad3812fbe3a0b8fd9ee6861488dbae0fbc00248f43dc50"} Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.664609 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3d10fcb0-4d45-45bf-a663-971b8ce74010","Type":"ContainerStarted","Data":"9f39dd85c04bee002aa20f6ebdae34460e35ba660b37d65ea48aa4af4d70b080"} Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.666954 4837 generic.go:334] "Generic (PLEG): container finished" podID="362e31d4-ea62-40ed-8426-982d47559472" containerID="e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a" exitCode=0 Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.667598 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerDied","Data":"e6271d6852050c2d1ad25293179089e8f2036d9ca9c515d25f1b4682afdad63a"} Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.683222 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ls998" podStartSLOduration=12.016478999 podStartE2EDuration="19.683204907s" podCreationTimestamp="2026-03-13 12:05:11 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.159926819 +0000 UTC m=+1032.798193602" lastFinishedPulling="2026-03-13 12:05:24.826652747 +0000 UTC m=+1040.464919510" observedRunningTime="2026-03-13 12:05:28.657315886 +0000 UTC m=+1044.295582649" watchObservedRunningTime="2026-03-13 
12:05:30.683204907 +0000 UTC m=+1046.321471670" Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.713753 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.900969137 podStartE2EDuration="17.713725724s" podCreationTimestamp="2026-03-13 12:05:13 +0000 UTC" firstStartedPulling="2026-03-13 12:05:18.362623155 +0000 UTC m=+1034.000889918" lastFinishedPulling="2026-03-13 12:05:30.175379742 +0000 UTC m=+1045.813646505" observedRunningTime="2026-03-13 12:05:30.708462408 +0000 UTC m=+1046.346729181" watchObservedRunningTime="2026-03-13 12:05:30.713725724 +0000 UTC m=+1046.351992487" Mar 13 12:05:30 crc kubenswrapper[4837]: I0313 12:05:30.741923 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.952283422 podStartE2EDuration="17.741895136s" podCreationTimestamp="2026-03-13 12:05:13 +0000 UTC" firstStartedPulling="2026-03-13 12:05:18.367990405 +0000 UTC m=+1034.006257168" lastFinishedPulling="2026-03-13 12:05:30.157602119 +0000 UTC m=+1045.795868882" observedRunningTime="2026-03-13 12:05:30.730106714 +0000 UTC m=+1046.368373477" watchObservedRunningTime="2026-03-13 12:05:30.741895136 +0000 UTC m=+1046.380161899" Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.239290 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.675675 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd","Type":"ContainerStarted","Data":"4da61201dc7ec1c5870826529a37d07172ef4f8646e96fa3dc5baf6b06eeb75f"} Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.677395 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"362e31d4-ea62-40ed-8426-982d47559472","Type":"ContainerStarted","Data":"7833797c9eba5aca08144d6c0ccc75cf7c6d31ad43b74cf435cc15ebd56e332f"} Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.697532 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.878127273 podStartE2EDuration="28.697511217s" podCreationTimestamp="2026-03-13 12:05:03 +0000 UTC" firstStartedPulling="2026-03-13 12:05:17.14513735 +0000 UTC m=+1032.783404113" lastFinishedPulling="2026-03-13 12:05:24.964521294 +0000 UTC m=+1040.602788057" observedRunningTime="2026-03-13 12:05:31.693915592 +0000 UTC m=+1047.332182375" watchObservedRunningTime="2026-03-13 12:05:31.697511217 +0000 UTC m=+1047.335777980" Mar 13 12:05:31 crc kubenswrapper[4837]: I0313 12:05:31.713876 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.025298679 podStartE2EDuration="27.713856744s" podCreationTimestamp="2026-03-13 12:05:04 +0000 UTC" firstStartedPulling="2026-03-13 12:05:16.925361859 +0000 UTC m=+1032.563628622" lastFinishedPulling="2026-03-13 12:05:25.613919924 +0000 UTC m=+1041.252186687" observedRunningTime="2026-03-13 12:05:31.712150881 +0000 UTC m=+1047.350417644" watchObservedRunningTime="2026-03-13 12:05:31.713856744 +0000 UTC m=+1047.352123507" Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.013820 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.292589 4837 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.349417 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:32 crc kubenswrapper[4837]: I0313 12:05:32.684344 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns" containerID="cri-o://5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" gracePeriod=10 Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.021530 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.063520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.068779 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120063 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.120851 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") pod \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\" (UID: \"b082689f-6a6d-4da0-b2b1-f78343ba1e85\") " Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.165557 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv" (OuterVolumeSpecName: "kube-api-access-bcjgv") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "kube-api-access-bcjgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.204304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config" (OuterVolumeSpecName: "config") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.215426 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b082689f-6a6d-4da0-b2b1-f78343ba1e85" (UID: "b082689f-6a6d-4da0-b2b1-f78343ba1e85"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.256265 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263101 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263141 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcjgv\" (UniqueName: \"kubernetes.io/projected/b082689f-6a6d-4da0-b2b1-f78343ba1e85-kube-api-access-bcjgv\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.263157 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b082689f-6a6d-4da0-b2b1-f78343ba1e85-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.290052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.692336 4837 generic.go:334] "Generic (PLEG): container finished" podID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" exitCode=0 Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.692420 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"} Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-7wbz7" event={"ID":"b082689f-6a6d-4da0-b2b1-f78343ba1e85","Type":"ContainerDied","Data":"cf45881769a320d80a0b475b78878572d21663b4dd8fafe7ae1c9681a95c4a07"} Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693585 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693704 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.693668 4837 scope.go:117] "RemoveContainer" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.724529 4837 scope.go:117] "RemoveContainer" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.736371 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.743990 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-7wbz7"] Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.744190 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.744436 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.757394 4837 scope.go:117] "RemoveContainer" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" Mar 13 12:05:33 crc kubenswrapper[4837]: E0313 12:05:33.760115 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": container with ID starting with 5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c not found: ID does not exist" containerID="5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760161 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c"} err="failed to get container status \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": rpc error: code = NotFound desc = could not find container \"5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c\": container with ID starting with 5938df2cd4038d22580ec15dba2d8841cc84e40dbbc2c06c0a6f8ca76641fd6c not found: ID does not exist" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760188 4837 scope.go:117] "RemoveContainer" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea" Mar 13 12:05:33 crc kubenswrapper[4837]: E0313 12:05:33.760496 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": container with ID starting with 31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea not found: ID does not exist" containerID="31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea" Mar 13 12:05:33 crc kubenswrapper[4837]: I0313 12:05:33.760533 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea"} err="failed to get container status \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": rpc error: code = NotFound desc = could not find container \"31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea\": container with ID starting with 31ac9d4659c535248d95613eed33e384ae2a9e10c0405e3e50bc5460cd46aeea not found: ID does not exist" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.005936 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:34 crc kubenswrapper[4837]: E0313 12:05:34.006326 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006346 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns" Mar 13 12:05:34 crc kubenswrapper[4837]: E0313 12:05:34.006377 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="init" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006384 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="init" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.006528 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" containerName="dnsmasq-dns" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.007380 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.013320 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.023895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073427 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.073775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.074044 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.175827 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176584 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.176829 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.178673 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-w69p6"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.179616 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.183203 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.221463 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"dnsmasq-dns-7fd796d7df-c7zsw\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.252205 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w69p6"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281253 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281302 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod 
\"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281403 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.281443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.283428 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.289040 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.321025 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.322404 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.326235 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.353098 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382838 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382886 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382912 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.382950 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383033 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383066 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 
13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383110 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.383192 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.384980 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovn-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.385721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-config\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.385798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-ovs-rundir\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.393621 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-combined-ca-bundle\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.409228 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w69p6\" (UID: \"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.444428 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.445024 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g7q\" (UniqueName: \"kubernetes.io/projected/18eb496a-7d9f-4bf6-af71-3b7b585d0f7d-kube-api-access-m5g7q\") pod \"ovn-controller-metrics-w69p6\" (UID: 
\"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d\") " pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.446210 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.456042 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464135 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464357 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464378 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.464469 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wbggf" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484394 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484470 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484571 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484597 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484625 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484727 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.484784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.486099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.486693 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.488346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.488997 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.497190 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w69p6" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.521469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"dnsmasq-dns-86db49b7ff-f6w8l\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587076 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.587295 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.588702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.590229 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-config\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.590550 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-scripts\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592284 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592440 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.592938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.593165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.595686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.618700 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvhq\" (UniqueName: \"kubernetes.io/projected/25ea0f5e-e277-4944-8c9d-2c7709e1a8cf-kube-api-access-vtvhq\") pod \"ovn-northd-0\" (UID: \"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf\") " pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.745868 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.747881 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.818329 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.854224 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 12:05:34 crc kubenswrapper[4837]: I0313 12:05:34.926876 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.066512 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b082689f-6a6d-4da0-b2b1-f78343ba1e85" path="/var/lib/kubelet/pods/b082689f-6a6d-4da0-b2b1-f78343ba1e85/volumes" Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.067432 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w69p6"] Mar 13 12:05:35 crc kubenswrapper[4837]: W0313 12:05:35.288317 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod177d4af4_1f81_43ff_bcbc_b3d74689452f.slice/crio-e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3 WatchSource:0}: Error finding container e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3: Status 404 returned error can't find the container with id e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3 Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.290306 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.405258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 12:05:35 crc kubenswrapper[4837]: W0313 12:05:35.411772 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ea0f5e_e277_4944_8c9d_2c7709e1a8cf.slice/crio-4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54 WatchSource:0}: Error finding container 4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54: Status 404 returned error can't find the container with id 4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54 Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.708913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerStarted","Data":"e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.710072 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"4008525c8372527fc7d833af378cd520d521880ba97e58c049fd34a0eb17fd54"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.711184 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w69p6" event={"ID":"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d","Type":"ContainerStarted","Data":"0f115ab9b4ae23f3688aa37ceef85bf3548b247962900d3d66b9e2f0e067e2d5"} Mar 13 12:05:35 crc kubenswrapper[4837]: I0313 12:05:35.712438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerStarted","Data":"be59ccb857947065a413e020e34f05fc612d0f53e32fd7765216d90d410b005f"} Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.187170 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.187848 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.741758 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:36 crc kubenswrapper[4837]: I0313 12:05:36.820979 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.442242 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.442958 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.479028 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.482832 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.496924 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559425 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.559764 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661382 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod 
\"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.661977 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662189 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.662817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.663117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.663630 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.682075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"dnsmasq-dns-698758b865-gqrt7\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " 
pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.736388 4837 generic.go:334] "Generic (PLEG): container finished" podID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerID="008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85" exitCode=0 Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.736442 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerDied","Data":"008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.739190 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerStarted","Data":"f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.740431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w69p6" event={"ID":"18eb496a-7d9f-4bf6-af71-3b7b585d0f7d","Type":"ContainerStarted","Data":"e69192ea0ed98994a829f21ad52593d7568b0b12212fa0fc89516fd2ecf81eff"} Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.761068 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-w69p6" podStartSLOduration=4.76104943 podStartE2EDuration="4.76104943s" podCreationTimestamp="2026-03-13 12:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:38.756617949 +0000 UTC m=+1054.394884712" watchObservedRunningTime="2026-03-13 12:05:38.76104943 +0000 UTC m=+1054.399316193" Mar 13 12:05:38 crc kubenswrapper[4837]: I0313 12:05:38.814260 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.276316 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.583864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.594543 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.597878 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598120 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598250 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lccz5" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.598250 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.613689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678903 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.678928 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.679372 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.749691 4837 generic.go:334] "Generic (PLEG): container finished" podID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerID="f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a" exitCode=0 Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.749842 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" 
event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerDied","Data":"f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a"} Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.752923 4837 generic.go:334] "Generic (PLEG): container finished" podID="de68f8fe-0650-4ef4-9445-d31e119de423" containerID="e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220" exitCode=0 Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.754183 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220"} Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.754239 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerStarted","Data":"d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d"} Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.784925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785062 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785074 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:39 crc kubenswrapper[4837]: E0313 12:05:39.785112 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:40.285097497 +0000 UTC m=+1055.923364260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785300 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785337 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.785466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.786093 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.786269 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-cache\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.787612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/59565710-b9bc-46e6-ad92-7f12376de17c-lock\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.795881 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59565710-b9bc-46e6-ad92-7f12376de17c-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.839500 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wmlmq\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-kube-api-access-wmlmq\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:39 crc kubenswrapper[4837]: I0313 12:05:39.849876 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.177337 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.262289 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302691 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302806 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.302973 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") pod \"177d4af4-1f81-43ff-bcbc-b3d74689452f\" (UID: \"177d4af4-1f81-43ff-bcbc-b3d74689452f\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.303249 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303430 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303466 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:40 crc kubenswrapper[4837]: E0313 12:05:40.303534 4837 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:41.303508048 +0000 UTC m=+1056.941774811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.307546 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx" (OuterVolumeSpecName: "kube-api-access-s72rx") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "kube-api-access-s72rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.323047 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.323653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.325220 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config" (OuterVolumeSpecName: "config") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.326745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "177d4af4-1f81-43ff-bcbc-b3d74689452f" (UID: "177d4af4-1f81-43ff-bcbc-b3d74689452f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404166 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404573 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.404631 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") pod \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\" (UID: \"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5\") " Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405004 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405020 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405029 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405037 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/177d4af4-1f81-43ff-bcbc-b3d74689452f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.405046 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s72rx\" (UniqueName: \"kubernetes.io/projected/177d4af4-1f81-43ff-bcbc-b3d74689452f-kube-api-access-s72rx\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.407226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl" (OuterVolumeSpecName: "kube-api-access-tblrl") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "kube-api-access-tblrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.420082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config" (OuterVolumeSpecName: "config") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.422069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.422874 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" (UID: "3a34d203-ee9d-4cb5-b2a4-64f57334e1e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506787 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tblrl\" (UniqueName: \"kubernetes.io/projected/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-kube-api-access-tblrl\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506848 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506865 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.506878 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"60b408b07a440275036d25737bc6cd3f4e00346e306c3b99863ad3c2d758fb25"} Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763100 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25ea0f5e-e277-4944-8c9d-2c7709e1a8cf","Type":"ContainerStarted","Data":"ab64c3706e783fd7321b2f1fb1e02d3f494cfa6379c035375e2ab370a6d3a514"} Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.763805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.766665 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerStarted","Data":"a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9"} Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.766808 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772497 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772668 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-c7zsw" event={"ID":"3a34d203-ee9d-4cb5-b2a4-64f57334e1e5","Type":"ContainerDied","Data":"be59ccb857947065a413e020e34f05fc612d0f53e32fd7765216d90d410b005f"} Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.772734 4837 scope.go:117] "RemoveContainer" containerID="008473fe0498516be934d5e47eeaa2a9839d4bbed34d1703c646c45767593f85" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.775005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" event={"ID":"177d4af4-1f81-43ff-bcbc-b3d74689452f","Type":"ContainerDied","Data":"e5c07a9f609638a177aae5955938d17ec2d0fa96ca0f604f6cfe39e8b5c432a3"} Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.775249 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-f6w8l" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.797086 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.481699651 podStartE2EDuration="6.797068333s" podCreationTimestamp="2026-03-13 12:05:34 +0000 UTC" firstStartedPulling="2026-03-13 12:05:35.416188369 +0000 UTC m=+1051.054455132" lastFinishedPulling="2026-03-13 12:05:39.731557051 +0000 UTC m=+1055.369823814" observedRunningTime="2026-03-13 12:05:40.793976035 +0000 UTC m=+1056.432242808" watchObservedRunningTime="2026-03-13 12:05:40.797068333 +0000 UTC m=+1056.435335096" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.816889 4837 scope.go:117] "RemoveContainer" containerID="f9c06e52cd86e1b4bf6834ffdc5c8a0fcbac08a96f3424206d1b66cf562bc09a" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.818072 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gqrt7" podStartSLOduration=2.817815409 podStartE2EDuration="2.817815409s" podCreationTimestamp="2026-03-13 12:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:40.812238053 +0000 UTC m=+1056.450504816" watchObservedRunningTime="2026-03-13 12:05:40.817815409 +0000 UTC m=+1056.456082172" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.852091 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.911207 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.922191 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-c7zsw"] Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.968448 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:40 crc kubenswrapper[4837]: I0313 12:05:40.980817 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-f6w8l"] Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.017530 4837 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.057807 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" path="/var/lib/kubelet/pods/177d4af4-1f81-43ff-bcbc-b3d74689452f/volumes" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.058488 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" path="/var/lib/kubelet/pods/3a34d203-ee9d-4cb5-b2a4-64f57334e1e5/volumes" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.319195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319375 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319577 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.319706 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:43.319618134 +0000 UTC m=+1058.957884897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495504 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"] Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.495899 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495916 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: E0313 12:05:41.495975 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.495984 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496149 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="177d4af4-1f81-43ff-bcbc-b3d74689452f" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496165 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a34d203-ee9d-4cb5-b2a4-64f57334e1e5" containerName="init" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.496699 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.507456 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"] Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.509511 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.535415 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n42jz"] Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.536760 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.548920 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n42jz"] Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623750 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623886 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.623926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725173 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725319 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: 
\"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725375 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.725838 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.726747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.744021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"glance-db-create-n42jz\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " pod="openstack/glance-db-create-n42jz" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.749215 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"glance-a48b-account-create-update-ckblt\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.812100 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:41 crc kubenswrapper[4837]: I0313 12:05:41.858295 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.266323 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"] Mar 13 12:05:42 crc kubenswrapper[4837]: W0313 12:05:42.270298 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2936dcb_f1fa_446b_b20f_87e09a9c03ee.slice/crio-930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49 WatchSource:0}: Error finding container 930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49: Status 404 returned error can't find the container with id 930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49 Mar 13 12:05:42 crc kubenswrapper[4837]: W0313 12:05:42.389310 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28320b08_9dde_491d_b151_21f93395bf10.slice/crio-79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773 WatchSource:0}: Error finding container 79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773: Status 404 returned error can't find the container with id 79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773 Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.393309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n42jz"] Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.795199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerStarted","Data":"9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16"} Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.795252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerStarted","Data":"79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773"} Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.796912 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerStarted","Data":"4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1"} Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.796958 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerStarted","Data":"930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49"} Mar 13 12:05:42 crc kubenswrapper[4837]: I0313 12:05:42.817726 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-a48b-account-create-update-ckblt" podStartSLOduration=1.817707908 podStartE2EDuration="1.817707908s" podCreationTimestamp="2026-03-13 12:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:42.816373735 +0000 UTC m=+1058.454640498" watchObservedRunningTime="2026-03-13 12:05:42.817707908 +0000 UTC m=+1058.455974681" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.371945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372264 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372321 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.372410 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:47.372382558 +0000 UTC m=+1063.010649311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.389751 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.391158 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: W0313 12:05:43.393520 4837 reflector.go:561] object-"openstack"/"openstack-mariadb-root-db-secret": failed to list *v1.Secret: secrets "openstack-mariadb-root-db-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 13 12:05:43 crc kubenswrapper[4837]: E0313 12:05:43.393591 4837 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-mariadb-root-db-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-mariadb-root-db-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.409314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.473731 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.474118 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.576240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.576354 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.577085 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.588172 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-69xgx"] Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.589185 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.591281 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.591292 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.594145 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.603360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"root-account-create-update-tgg8d\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.619242 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-69xgx"] Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678290 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678349 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678447 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678521 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678571 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.678709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.708401 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780455 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780533 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780674 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780757 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.780825 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.781970 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.782551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.782935 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.784598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.784689 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.785241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.803318 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"swift-ring-rebalance-69xgx\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.818231 4837 generic.go:334] "Generic (PLEG): container finished" podID="28320b08-9dde-491d-b151-21f93395bf10" containerID="9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16" exitCode=0 Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.819198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerDied","Data":"9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16"} Mar 13 12:05:43 crc kubenswrapper[4837]: I0313 12:05:43.952932 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.155808 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:44 crc kubenswrapper[4837]: W0313 12:05:44.158869 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2c8bb14_f7ed_4a97_a6f8_73f67824897e.slice/crio-8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739 WatchSource:0}: Error finding container 8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739: Status 404 returned error can't find the container with id 8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.370411 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-69xgx"] Mar 13 12:05:44 crc kubenswrapper[4837]: W0313 12:05:44.380229 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24998567_afa6_4adc_a503_4fc054946aef.slice/crio-1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b WatchSource:0}: Error finding container 1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b: Status 404 returned error can't find the container with id 1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.426491 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827276 4837 generic.go:334] "Generic (PLEG): container finished" podID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerID="57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046" exitCode=0 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerDied","Data":"57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.827604 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerStarted","Data":"8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.829867 4837 generic.go:334] "Generic (PLEG): container finished" podID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerID="4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1" exitCode=0 Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.830012 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerDied","Data":"4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1"} Mar 13 12:05:44 crc kubenswrapper[4837]: I0313 12:05:44.831460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerStarted","Data":"1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b"} Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.213200 4837 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328144 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") pod \"28320b08-9dde-491d-b151-21f93395bf10\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") pod \"28320b08-9dde-491d-b151-21f93395bf10\" (UID: \"28320b08-9dde-491d-b151-21f93395bf10\") " Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.328905 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28320b08-9dde-491d-b151-21f93395bf10" (UID: "28320b08-9dde-491d-b151-21f93395bf10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.329332 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28320b08-9dde-491d-b151-21f93395bf10-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.335310 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4" (OuterVolumeSpecName: "kube-api-access-px6d4") pod "28320b08-9dde-491d-b151-21f93395bf10" (UID: "28320b08-9dde-491d-b151-21f93395bf10"). InnerVolumeSpecName "kube-api-access-px6d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.431835 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px6d4\" (UniqueName: \"kubernetes.io/projected/28320b08-9dde-491d-b151-21f93395bf10-kube-api-access-px6d4\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.838718 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n42jz" Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.840477 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n42jz" event={"ID":"28320b08-9dde-491d-b151-21f93395bf10","Type":"ContainerDied","Data":"79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773"} Mar 13 12:05:45 crc kubenswrapper[4837]: I0313 12:05:45.840513 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f305c056052302aa6aedb9477a994fc2b391f952b1de999fb2cdfcefd3e773" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.729921 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.739525 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tgg8d" event={"ID":"b2c8bb14-f7ed-4a97-a6f8-73f67824897e","Type":"ContainerDied","Data":"8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739"} Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849313 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d935243c5f433e3064f23569f68df8366157f2b6f6ece0f72a4fdf38d92f739" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.849362 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tgg8d" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.851748 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a48b-account-create-update-ckblt" event={"ID":"f2936dcb-f1fa-446b-b20f-87e09a9c03ee","Type":"ContainerDied","Data":"930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49"} Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.851777 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930fa6fd842bef07ab99729d63d9c467c183629e38c0c3b6e35eacef428e4c49" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.852104 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a48b-account-create-update-ckblt" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867460 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") pod \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") pod \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867879 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") pod \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\" (UID: \"f2936dcb-f1fa-446b-b20f-87e09a9c03ee\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.867940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") pod \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\" (UID: \"b2c8bb14-f7ed-4a97-a6f8-73f67824897e\") " Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.869069 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2936dcb-f1fa-446b-b20f-87e09a9c03ee" (UID: "f2936dcb-f1fa-446b-b20f-87e09a9c03ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.869130 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2c8bb14-f7ed-4a97-a6f8-73f67824897e" (UID: "b2c8bb14-f7ed-4a97-a6f8-73f67824897e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.874686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz" (OuterVolumeSpecName: "kube-api-access-6n7fz") pod "f2936dcb-f1fa-446b-b20f-87e09a9c03ee" (UID: "f2936dcb-f1fa-446b-b20f-87e09a9c03ee"). InnerVolumeSpecName "kube-api-access-6n7fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.876790 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4" (OuterVolumeSpecName: "kube-api-access-vw8n4") pod "b2c8bb14-f7ed-4a97-a6f8-73f67824897e" (UID: "b2c8bb14-f7ed-4a97-a6f8-73f67824897e"). InnerVolumeSpecName "kube-api-access-vw8n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969664 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969707 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969721 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n7fz\" (UniqueName: \"kubernetes.io/projected/f2936dcb-f1fa-446b-b20f-87e09a9c03ee-kube-api-access-6n7fz\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:46 crc kubenswrapper[4837]: I0313 12:05:46.969736 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw8n4\" (UniqueName: \"kubernetes.io/projected/b2c8bb14-f7ed-4a97-a6f8-73f67824897e-kube-api-access-vw8n4\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.305971 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306322 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306338 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306378 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306391 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 
12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.306409 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306417 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306618 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306658 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="28320b08-9dde-491d-b151-21f93395bf10" containerName="mariadb-database-create" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.306672 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" containerName="mariadb-account-create-update" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.307191 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.327701 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.376576 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.377251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.377490 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.377932 4837 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.377953 4837 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: E0313 12:05:47.378047 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift podName:59565710-b9bc-46e6-ad92-7f12376de17c nodeName:}" failed. No retries permitted until 2026-03-13 12:05:55.378027569 +0000 UTC m=+1071.016294332 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift") pod "swift-storage-0" (UID: "59565710-b9bc-46e6-ad92-7f12376de17c") : configmap "swift-ring-files" not found Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.411834 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.413457 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.416079 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.423329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.479958 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.480043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.480135 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.480229 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.490592 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.516407 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"keystone-db-create-gmczg\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.539070 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rb248"] Mar 
13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.540528 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.587333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.587873 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.589307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.598264 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.606684 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"keystone-d970-account-create-update-lkc7z\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.621452 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.630657 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.633240 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.633380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.637411 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.689300 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.689356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.742976 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792128 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.792281 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.793296 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 
12:05:47.814186 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"placement-db-create-rb248\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.871277 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.894481 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.894595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.895259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.914569 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"placement-d9fb-account-create-update-5jvwd\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:47 crc kubenswrapper[4837]: I0313 12:05:47.954377 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:48 crc kubenswrapper[4837]: W0313 12:05:48.518171 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5601ea4_ee81_4e2a_b370_268652332465.slice/crio-390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27 WatchSource:0}: Error finding container 390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27: Status 404 returned error can't find the container with id 390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.519285 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.581551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:05:48 crc kubenswrapper[4837]: W0313 12:05:48.605982 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737740b8_437c_4c6a_a16f_ac0afcf40b95.slice/crio-d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895 WatchSource:0}: Error finding container d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895: Status 404 returned error can't find the container with id d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.652858 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.670430 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.815877 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.864582 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerStarted","Data":"ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.866447 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerStarted","Data":"ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.867919 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerStarted","Data":"258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.867976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerStarted","Data":"390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.869351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" 
event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerStarted","Data":"b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.871889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerStarted","Data":"d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895"} Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.874598 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.874834 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" containerID="cri-o://5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" gracePeriod=10 Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.897850 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-69xgx" podStartSLOduration=2.213518497 podStartE2EDuration="5.89781357s" podCreationTimestamp="2026-03-13 12:05:43 +0000 UTC" firstStartedPulling="2026-03-13 12:05:44.382503534 +0000 UTC m=+1060.020770297" lastFinishedPulling="2026-03-13 12:05:48.066798607 +0000 UTC m=+1063.705065370" observedRunningTime="2026-03-13 12:05:48.894788134 +0000 UTC m=+1064.533054907" watchObservedRunningTime="2026-03-13 12:05:48.89781357 +0000 UTC m=+1064.536080333" Mar 13 12:05:48 crc kubenswrapper[4837]: I0313 12:05:48.923149 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d9fb-account-create-update-5jvwd" podStartSLOduration=1.9231266420000002 podStartE2EDuration="1.923126642s" podCreationTimestamp="2026-03-13 12:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:48.914591371 +0000 UTC m=+1064.552858134" watchObservedRunningTime="2026-03-13 12:05:48.923126642 +0000 UTC m=+1064.561393415" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.405735 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.530377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") pod \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\" (UID: \"ae5deee0-59c4-4fa7-8d8c-e12b516885dc\") " Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.543891 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95" (OuterVolumeSpecName: "kube-api-access-lhg95") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "kube-api-access-lhg95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.595383 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.608955 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config" (OuterVolumeSpecName: "config") pod "ae5deee0-59c4-4fa7-8d8c-e12b516885dc" (UID: "ae5deee0-59c4-4fa7-8d8c-e12b516885dc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631860 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631896 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.631906 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhg95\" (UniqueName: \"kubernetes.io/projected/ae5deee0-59c4-4fa7-8d8c-e12b516885dc-kube-api-access-lhg95\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.860544 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.867682 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tgg8d"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.879401 4837 generic.go:334] "Generic (PLEG): container finished" podID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerID="e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.879449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerDied","Data":"e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881347 4837 generic.go:334] "Generic (PLEG): container finished" podID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881437 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" event={"ID":"ae5deee0-59c4-4fa7-8d8c-e12b516885dc","Type":"ContainerDied","Data":"dac372c91256bbeabdb4ee95a6241431746c1c91b7bb9f40ca6c3bd206fe1f51"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881458 4837 scope.go:117] "RemoveContainer" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.881560 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7vs6f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.885724 4837 generic.go:334] "Generic (PLEG): container finished" podID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerID="7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.885819 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerDied","Data":"7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.886939 4837 generic.go:334] "Generic (PLEG): container finished" podID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerID="40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.886986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerDied","Data":"40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.888071 4837 generic.go:334] "Generic (PLEG): container finished" podID="c5601ea4-ee81-4e2a-b370-268652332465" containerID="258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c" exitCode=0 Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.888976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerDied","Data":"258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c"} Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.906245 4837 scope.go:117] "RemoveContainer" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.949518 4837 scope.go:117] "RemoveContainer" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: E0313 12:05:49.950982 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": container with ID starting with 5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f not found: ID does not exist" containerID="5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951064 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f"} err="failed to get container status \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": rpc error: code = NotFound desc = could not find container \"5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f\": container with ID starting with 5550e9e7c07715ced202f75aa48fd7f0efbb60ae462c21120aa95405a645e06f not found: ID does not exist" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951109 4837 scope.go:117] "RemoveContainer" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc kubenswrapper[4837]: E0313 12:05:49.951477 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": container with ID starting with 86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a not found: ID does not exist" containerID="86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.951529 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a"} err="failed to get container status \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": rpc error: code = NotFound desc = could not find container \"86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a\": container with ID starting with 86bd795421dbb8f853056d0ba23fbd4446b0d4a512f91ff6b5cf999a58e9698a not found: ID does not exist" Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.965478 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:49 crc kubenswrapper[4837]: I0313 12:05:49.972798 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7vs6f"] Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.063839 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" path="/var/lib/kubelet/pods/ae5deee0-59c4-4fa7-8d8c-e12b516885dc/volumes" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.064775 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c8bb14-f7ed-4a97-a6f8-73f67824897e" path="/var/lib/kubelet/pods/b2c8bb14-f7ed-4a97-a6f8-73f67824897e/volumes" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.249490 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.361730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") pod \"737740b8-437c-4c6a-a16f-ac0afcf40b95\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.361827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") pod \"737740b8-437c-4c6a-a16f-ac0afcf40b95\" (UID: \"737740b8-437c-4c6a-a16f-ac0afcf40b95\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.362760 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "737740b8-437c-4c6a-a16f-ac0afcf40b95" (UID: "737740b8-437c-4c6a-a16f-ac0afcf40b95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.372010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz" (OuterVolumeSpecName: "kube-api-access-g5qkz") pod "737740b8-437c-4c6a-a16f-ac0afcf40b95" (UID: "737740b8-437c-4c6a-a16f-ac0afcf40b95"). InnerVolumeSpecName "kube-api-access-g5qkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464387 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464515 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5qkz\" (UniqueName: \"kubernetes.io/projected/737740b8-437c-4c6a-a16f-ac0afcf40b95-kube-api-access-g5qkz\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.464943 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/737740b8-437c-4c6a-a16f-ac0afcf40b95-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.474935 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.488039 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566011 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") pod \"c5601ea4-ee81-4e2a-b370-268652332465\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") pod \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566179 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") pod \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\" (UID: \"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566247 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") pod \"2230cdcb-087e-4882-8aea-c5d850b711ac\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566296 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") pod \"2230cdcb-087e-4882-8aea-c5d850b711ac\" (UID: \"2230cdcb-087e-4882-8aea-c5d850b711ac\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566349 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") pod \"c5601ea4-ee81-4e2a-b370-268652332465\" (UID: \"c5601ea4-ee81-4e2a-b370-268652332465\") " Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.566536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5601ea4-ee81-4e2a-b370-268652332465" (UID: "c5601ea4-ee81-4e2a-b370-268652332465"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567018 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5601ea4-ee81-4e2a-b370-268652332465-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" (UID: "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.567381 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2230cdcb-087e-4882-8aea-c5d850b711ac" (UID: "2230cdcb-087e-4882-8aea-c5d850b711ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.569427 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn" (OuterVolumeSpecName: "kube-api-access-9fwgn") pod "2230cdcb-087e-4882-8aea-c5d850b711ac" (UID: "2230cdcb-087e-4882-8aea-c5d850b711ac"). InnerVolumeSpecName "kube-api-access-9fwgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.569927 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8" (OuterVolumeSpecName: "kube-api-access-9j7s8") pod "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" (UID: "5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e"). InnerVolumeSpecName "kube-api-access-9j7s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.573138 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t" (OuterVolumeSpecName: "kube-api-access-npm9t") pod "c5601ea4-ee81-4e2a-b370-268652332465" (UID: "c5601ea4-ee81-4e2a-b370-268652332465"). InnerVolumeSpecName "kube-api-access-npm9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651142 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651534 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="init" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651553 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="init" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651566 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651574 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651591 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651615 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651623 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651658 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: E0313 12:05:51.651695 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651705 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651895 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5deee0-59c4-4fa7-8d8c-e12b516885dc" containerName="dnsmasq-dns" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651914 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651927 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651942 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" containerName="mariadb-database-create" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.651956 4837 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c5601ea4-ee81-4e2a-b370-268652332465" containerName="mariadb-account-create-update" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.652614 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.658714 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.658731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dvhzm" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.660163 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670618 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwgn\" (UniqueName: \"kubernetes.io/projected/2230cdcb-087e-4882-8aea-c5d850b711ac-kube-api-access-9fwgn\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670688 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2230cdcb-087e-4882-8aea-c5d850b711ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670709 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npm9t\" (UniqueName: \"kubernetes.io/projected/c5601ea4-ee81-4e2a-b370-268652332465-kube-api-access-npm9t\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670725 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.670741 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j7s8\" (UniqueName: \"kubernetes.io/projected/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e-kube-api-access-9j7s8\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772546 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772709 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772758 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.772878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.874881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.874965 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.875007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.875101 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.879460 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.879762 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.880139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.895241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"glance-db-sync-jkthw\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.907998 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d9fb-account-create-update-5jvwd" event={"ID":"c5601ea4-ee81-4e2a-b370-268652332465","Type":"ContainerDied","Data":"390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27"} 
Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.908054 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390d3e9855f3f57028fb161478457ea4b1e4ef62c19e2881ed44da5bdeee6e27" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.908119 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d9fb-account-create-update-5jvwd" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gmczg" event={"ID":"5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e","Type":"ContainerDied","Data":"b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911070 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b098a43f2d9d03bb99f812f96bcd81706cca6018f0407d1c0e0208765a59836f" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.911125 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gmczg" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921036 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rb248" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921034 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rb248" event={"ID":"737740b8-437c-4c6a-a16f-ac0afcf40b95","Type":"ContainerDied","Data":"d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.921319 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70df55b1790a1ebbe6a4a6a6fb3d1059c01bf3fda86f5702d2b75048a834895" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.923512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d970-account-create-update-lkc7z" event={"ID":"2230cdcb-087e-4882-8aea-c5d850b711ac","Type":"ContainerDied","Data":"ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188"} Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.923542 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba21907e7c2549bba4ed2433e390f6db6aab42f1bc7683bed090aa5abb21d188" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.927966 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d970-account-create-update-lkc7z" Mar 13 12:05:51 crc kubenswrapper[4837]: I0313 12:05:51.970969 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:05:52 crc kubenswrapper[4837]: I0313 12:05:52.481384 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:05:52 crc kubenswrapper[4837]: I0313 12:05:52.933450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerStarted","Data":"17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479"} Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.849747 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.850713 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.854393 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.858887 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.926397 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.926494 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:54 crc kubenswrapper[4837]: I0313 12:05:54.959754 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.028712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.028832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.030737 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.055135 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"root-account-create-update-zgdc9\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.215729 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.442429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.451073 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/59565710-b9bc-46e6-ad92-7f12376de17c-etc-swift\") pod \"swift-storage-0\" (UID: \"59565710-b9bc-46e6-ad92-7f12376de17c\") " pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.514375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.663505 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:05:55 crc kubenswrapper[4837]: W0313 12:05:55.673101 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1ebb88_e1c9_4839_9c66_8bd86e4b0d5f.slice/crio-50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970 WatchSource:0}: Error finding container 50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970: Status 404 returned error can't find the container with id 50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970 Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.959876 4837 generic.go:334] "Generic (PLEG): container finished" podID="24998567-afa6-4adc-a503-4fc054946aef" containerID="ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9" exitCode=0 Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.959906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerDied","Data":"ab5c46268962acc94f8e7f96b6af1d93a9a7a4507799423762e91ef22d7a30a9"} Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.962223 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerStarted","Data":"248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac"} Mar 13 12:05:55 crc kubenswrapper[4837]: I0313 12:05:55.962252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerStarted","Data":"50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970"} Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.003562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-zgdc9" podStartSLOduration=2.003540309 podStartE2EDuration="2.003540309s" podCreationTimestamp="2026-03-13 12:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:05:55.996400232 +0000 UTC m=+1071.634667005" watchObservedRunningTime="2026-03-13 12:05:56.003540309 +0000 UTC m=+1071.641807072" Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.139716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-storage-0"] Mar 13 12:05:56 crc kubenswrapper[4837]: W0313 12:05:56.175900 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59565710_b9bc_46e6_ad92_7f12376de17c.slice/crio-2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830 WatchSource:0}: Error finding container 2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830: Status 404 returned error can't find the container with id 2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830 Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.972412 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:05:56 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:05:56 crc kubenswrapper[4837]: > Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.974418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"2fe1e5cd43292f1fe51e0913fb9f74c254e49b17cae827baa09a3570dfd0d830"} Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.975568 4837 generic.go:334] "Generic (PLEG): container finished" podID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerID="248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac" exitCode=0 Mar 13 12:05:56 crc kubenswrapper[4837]: I0313 12:05:56.975985 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerDied","Data":"248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac"} Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.092429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.433530 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585180 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585293 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585331 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585372 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585416 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.585450 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.586173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.586804 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.587298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") pod \"24998567-afa6-4adc-a503-4fc054946aef\" (UID: \"24998567-afa6-4adc-a503-4fc054946aef\") " Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.588554 4837 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.588586 4837 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/24998567-afa6-4adc-a503-4fc054946aef-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.592601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5" (OuterVolumeSpecName: "kube-api-access-pcxh5") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "kube-api-access-pcxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.603819 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.622978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.630004 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.640140 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts" (OuterVolumeSpecName: "scripts") pod "24998567-afa6-4adc-a503-4fc054946aef" (UID: "24998567-afa6-4adc-a503-4fc054946aef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690471 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24998567-afa6-4adc-a503-4fc054946aef-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690825 4837 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690837 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690849 4837 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/24998567-afa6-4adc-a503-4fc054946aef-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:57 crc kubenswrapper[4837]: I0313 12:05:57.690859 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxh5\" (UniqueName: \"kubernetes.io/projected/24998567-afa6-4adc-a503-4fc054946aef-kube-api-access-pcxh5\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.019532 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"187247b205d7cc45292d712b11cea043a6f8c69a568d8a89e720e74e571f5b51"} Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.021551 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"00c6ab786fb53319d36acf5ced6f02a3ba152d4adfca8fe3dec6e9106fb84434"} Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028220 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-69xgx" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028312 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-69xgx" event={"ID":"24998567-afa6-4adc-a503-4fc054946aef","Type":"ContainerDied","Data":"1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b"} Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.028361 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ed0808ad30fc86e0f532b5d60900d3ad34dad8095b22ae074e18418eb961f7b" Mar 13 12:05:58 crc kubenswrapper[4837]: E0313 12:05:58.105793 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24998567_afa6_4adc_a503_4fc054946aef.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.583576 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.712562 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") pod \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.712738 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") pod \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\" (UID: \"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f\") " Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.714002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" (UID: "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.720019 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s" (OuterVolumeSpecName: "kube-api-access-w4x4s") pod "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" (UID: "2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f"). InnerVolumeSpecName "kube-api-access-w4x4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.814402 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4x4s\" (UniqueName: \"kubernetes.io/projected/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-kube-api-access-w4x4s\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:58 crc kubenswrapper[4837]: I0313 12:05:58.814442 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.043460 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" exitCode=0 Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.043537 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071274 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"0c046f2ec93fccdfe07d6437b7eb7f95762c9aa74d5b6a853c7d3a653c626650"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071326 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"a10a1af8e9199733d353c7a39cb8ff947c9d7b4af81a363d438963cbb65562b0"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071464 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-zgdc9" event={"ID":"2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f","Type":"ContainerDied","Data":"50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970"} Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071506 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d96f39e72fd0fb719f907fcf63961ca9fa32d054985b3b8e4e41d43e6a3970" Mar 13 12:05:59 crc kubenswrapper[4837]: I0313 12:05:59.071565 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zgdc9" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.084929 4837 generic.go:334] "Generic (PLEG): container finished" podID="13254c8b-516c-435e-9db2-a8d518434f29" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" exitCode=0 Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.085015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.092310 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerStarted","Data":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.093073 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149117 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:00 crc kubenswrapper[4837]: E0313 12:06:00.149475 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149489 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 crc kubenswrapper[4837]: E0313 12:06:00.149507 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149513 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149712 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="24998567-afa6-4adc-a503-4fc054946aef" containerName="swift-ring-rebalance" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.149729 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" containerName="mariadb-account-create-update" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.150191 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.152097 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.158017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.158265 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.183271 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.213351 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.564002375 podStartE2EDuration="58.213327017s" podCreationTimestamp="2026-03-13 12:05:02 +0000 UTC" firstStartedPulling="2026-03-13 12:05:15.770907991 +0000 UTC m=+1031.409174754" lastFinishedPulling="2026-03-13 12:05:24.420232633 +0000 UTC m=+1040.058499396" observedRunningTime="2026-03-13 12:06:00.196945378 +0000 UTC m=+1075.835212141" watchObservedRunningTime="2026-03-13 12:06:00.213327017 +0000 UTC m=+1075.851593780" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.247680 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.349570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.372809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"auto-csr-approver-29556726-gdbfm\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:00 crc kubenswrapper[4837]: I0313 12:06:00.481930 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:01 crc kubenswrapper[4837]: I0313 12:06:01.988265 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:06:01 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:06:01 crc kubenswrapper[4837]: > Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.065592 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ls998" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.283700 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.284831 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.288904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.313290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401706 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401745 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.401887 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503209 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503320 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503372 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503423 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503478 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.503814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.505948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506012 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.506902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.539319 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"ovn-controller-nbhpw-config-hrgcj\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:02 crc kubenswrapper[4837]: I0313 12:06:02.613845 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:06 crc kubenswrapper[4837]: I0313 12:06:06.969585 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nbhpw" podUID="32dc51d9-5638-4530-91c8-5be8c13e60f3" containerName="ovn-controller" probeResult="failure" output=< Mar 13 12:06:06 crc kubenswrapper[4837]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 12:06:06 crc kubenswrapper[4837]: > Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.230784 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.231832 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zh4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-jkthw_openstack(b4490fb3-45d7-4b40-ad34-5bf33ba88491): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:10 crc kubenswrapper[4837]: E0313 12:06:10.233028 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-jkthw" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" Mar 13 12:06:10 crc kubenswrapper[4837]: I0313 12:06:10.687089 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:10 crc kubenswrapper[4837]: W0313 12:06:10.705631 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6776e647_6987_4359_baa9_14ba621118d2.slice/crio-83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727 WatchSource:0}: Error finding container 83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727: Status 404 returned error can't find the container with id 83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727 Mar 13 12:06:10 crc kubenswrapper[4837]: I0313 12:06:10.744135 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:06:10 crc kubenswrapper[4837]: W0313 12:06:10.750913 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83f46fff_3510_4758_82a0_30099640fa33.slice/crio-97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a WatchSource:0}: Error finding container 97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a: Status 404 returned error can't find the container with id 97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.229547 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerStarted","Data":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.231565 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.235468 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerStarted","Data":"8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.235513 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerStarted","Data":"83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"34f268940b6244ea00e96250111548b4ac0a41e171f6d03580dec0254f9f213e"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"166173c05cd8668ae62e105ed92a57900aa789cb5bc89ec73b08c97b367f478c"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"af33c4644aa0b53237784c5c19cfe673d7980526a0f05cdb546c113f3a1f90ed"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.241284 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"dc72455cada2574d7dee9c4d762184172c399e0801f1156290b97c205dba2901"} Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.243218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerStarted","Data":"97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a"} Mar 13 12:06:11 crc kubenswrapper[4837]: E0313 12:06:11.243991 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-jkthw" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.274727 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=62.639597458 podStartE2EDuration="1m10.274706475s" podCreationTimestamp="2026-03-13 12:05:01 +0000 UTC" firstStartedPulling="2026-03-13 12:05:16.487863781 +0000 UTC m=+1032.126130544" lastFinishedPulling="2026-03-13 12:05:24.122972798 +0000 UTC m=+1039.761239561" observedRunningTime="2026-03-13 12:06:11.262747186 +0000 UTC m=+1086.901013959" watchObservedRunningTime="2026-03-13 12:06:11.274706475 +0000 UTC m=+1086.912973238" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.282993 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nbhpw-config-hrgcj" podStartSLOduration=9.282973796 podStartE2EDuration="9.282973796s" podCreationTimestamp="2026-03-13 12:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:11.279657851 +0000 UTC m=+1086.917924634" watchObservedRunningTime="2026-03-13 12:06:11.282973796 +0000 UTC m=+1086.921240559" Mar 13 12:06:11 crc kubenswrapper[4837]: I0313 12:06:11.973403 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nbhpw" Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.251825 4837 generic.go:334] "Generic (PLEG): container finished" podID="6776e647-6987-4359-baa9-14ba621118d2" containerID="8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d" exitCode=0 Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.251959 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerDied","Data":"8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d"} Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.254615 4837 generic.go:334] "Generic (PLEG): container finished" podID="83f46fff-3510-4758-82a0-30099640fa33" containerID="1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef" exitCode=0 Mar 13 12:06:12 crc kubenswrapper[4837]: I0313 12:06:12.254667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerDied","Data":"1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.267054 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"f527d66eb0a2a1ab3d7502e880eff4ba411c1086452adb7c46216c179a97766b"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268014 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"be4a385c843853220c9ce490f42ed62425d4c6c763d42763c0c52cd9c2057711"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"03de36fa370480219ae91f786ca94823d401f1b31b4a8b2433f10907447d95a0"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.268306 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"864955d1c783a52f4bfa111bbfec7cf156c8f217ae8471d28e2d198fcddaabe1"} Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.488408 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.916469 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.939245 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.948802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:13 crc kubenswrapper[4837]: E0313 12:06:13.949263 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949288 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: E0313 12:06:13.949314 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949322 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949553 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f46fff-3510-4758-82a0-30099640fa33" containerName="oc" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.949580 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6776e647-6987-4359-baa9-14ba621118d2" containerName="ovn-config" Mar 13 12:06:13 crc kubenswrapper[4837]: I0313 12:06:13.950242 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.010099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040847 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040947 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.040977 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") pod \"83f46fff-3510-4758-82a0-30099640fa33\" (UID: \"83f46fff-3510-4758-82a0-30099640fa33\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041039 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041123 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041198 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") pod \"6776e647-6987-4359-baa9-14ba621118d2\" (UID: \"6776e647-6987-4359-baa9-14ba621118d2\") " Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041421 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.041476 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 
12:06:14.042795 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.043909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts" (OuterVolumeSpecName: "scripts") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046672 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run" (OuterVolumeSpecName: "var-run") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046747 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.046774 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.047099 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn" (OuterVolumeSpecName: "kube-api-access-5jgwn") pod "6776e647-6987-4359-baa9-14ba621118d2" (UID: "6776e647-6987-4359-baa9-14ba621118d2"). InnerVolumeSpecName "kube-api-access-5jgwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.049736 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb" (OuterVolumeSpecName: "kube-api-access-9nqtb") pod "83f46fff-3510-4758-82a0-30099640fa33" (UID: "83f46fff-3510-4758-82a0-30099640fa33"). InnerVolumeSpecName "kube-api-access-9nqtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.173846 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.176171 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.187888 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.199735 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.199925 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200225 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200240 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200258 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jgwn\" (UniqueName: \"kubernetes.io/projected/6776e647-6987-4359-baa9-14ba621118d2-kube-api-access-5jgwn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200272 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200289 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nqtb\" (UniqueName: \"kubernetes.io/projected/83f46fff-3510-4758-82a0-30099640fa33-kube-api-access-9nqtb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200301 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6776e647-6987-4359-baa9-14ba621118d2-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.200318 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6776e647-6987-4359-baa9-14ba621118d2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.206291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.227719 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.235658 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"cinder-db-create-mbps4\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.281331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-hrgcj" event={"ID":"6776e647-6987-4359-baa9-14ba621118d2","Type":"ContainerDied","Data":"83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.281383 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83aae3e77d9623e024f76a9f48851e9cd8dc7696270a9bc666154f49b6134727" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.282138 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-hrgcj" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.282375 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"8eb2f07b8224427aa81b367530ce692a82952d6a8d767f3db25e035e590da8a4"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"746a6c97fb0b09e268cdc6160d706911a9bd2285c1da876051d1a9a15009d4fc"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.289926 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"59565710-b9bc-46e6-ad92-7f12376de17c","Type":"ContainerStarted","Data":"0497c856aa70619719f32b639c7ef5b1378ca0f50dd52e61ec0619c5244d0050"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" event={"ID":"83f46fff-3510-4758-82a0-30099640fa33","Type":"ContainerDied","Data":"97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a"} Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299052 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fa7ceb35dbfdb338b3e21974c4f2697b4dc443944351f3a97c4fa67dd5702a" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.299136 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556726-gdbfm" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.301225 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.301317 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.337323 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.338970 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.341671 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.342189 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.342466 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.343032 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.378479 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.09037901 podStartE2EDuration="36.378460118s" podCreationTimestamp="2026-03-13 12:05:38 +0000 UTC" firstStartedPulling="2026-03-13 12:05:56.179279255 +0000 UTC m=+1071.817546018" lastFinishedPulling="2026-03-13 12:06:12.467360363 +0000 UTC m=+1088.105627126" observedRunningTime="2026-03-13 12:06:14.353554639 +0000 UTC m=+1089.991821402" watchObservedRunningTime="2026-03-13 12:06:14.378460118 +0000 UTC m=+1090.016726881" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.379381 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.402293 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.402388 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " 
pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.404913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.438536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"cinder-9a59-account-create-update-hqxzk\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.439905 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.441073 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.456623 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.464009 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.464314 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.466425 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.507913 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.508437 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.509174 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.510747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.510904 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.511111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.511192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.612078 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616299 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616569 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616766 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.616868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.617089 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.627686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.627926 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.628685 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.630182 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.647724 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.658196 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.659550 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.673077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.679943 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"keystone-db-sync-2dlt8\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.682226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"neutron-db-create-45j5g\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.711752 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720808 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720867 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.720902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.721025 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.721826 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.740815 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.759769 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.760747 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.769637 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.786163 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"neutron-330b-account-create-update-snkff\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.801563 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823230 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823512 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.823813 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.824028 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.853022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"barbican-db-create-g24hg\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.917049 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925350 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925376 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925549 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.925570 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.927022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.950814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"barbican-6b07-account-create-update-wxqsd\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:14 crc kubenswrapper[4837]: I0313 12:06:14.986859 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.027710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029310 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029345 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029405 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029448 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.029518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.031591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc 
kubenswrapper[4837]: I0313 12:06:15.032498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.033165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.035174 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.035868 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.069543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.102139 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.102393 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.129475 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"dnsmasq-dns-764c5664d7-5nlfg\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.152084 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.159878 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556720-wqrqr"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.182869 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.212585 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.228763 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nbhpw-config-hrgcj"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.341227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerStarted","Data":"e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467"} Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.359863 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.378098 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.380447 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.383592 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.406332 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.459865 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:06:15 crc kubenswrapper[4837]: W0313 12:06:15.484380 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b37e8b_50ec_402e_ae31_27ff0d84e0be.slice/crio-80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356 WatchSource:0}: Error finding container 80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356: Status 404 returned error can't find the container with id 80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356 Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.556901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.556983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557036 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557061 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557079 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.557145 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.661389 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.661935 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc 
kubenswrapper[4837]: I0313 12:06:15.662352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662438 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.662553 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.663146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.664627 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.687848 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"ovn-controller-nbhpw-config-kxsw9\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.702136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.715956 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.857917 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:06:15 crc kubenswrapper[4837]: W0313 12:06:15.860871 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19cfb16d_f7a7_4f5d_baa9_b00eaecf1dfe.slice/crio-396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad WatchSource:0}: Error finding container 396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad: Status 404 returned error can't find the container with id 396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad Mar 13 12:06:15 crc kubenswrapper[4837]: I0313 12:06:15.953292 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:15.999544 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.044811 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.367171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerStarted","Data":"2cbead7100d7df29ad960b80cb3c7ee5eb871cec6fea242940565dd0d3726566"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.384364 4837 generic.go:334] "Generic (PLEG): container finished" podID="685f13a4-d293-4199-8049-67b02c0162c1" containerID="8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba" exitCode=0 Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.384448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerDied","Data":"8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.392181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerStarted","Data":"286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.392230 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerStarted","Data":"80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.394701 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerStarted","Data":"396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.401895 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerStarted","Data":"41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.409382 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerStarted","Data":"42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.409431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerStarted","Data":"4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.413970 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerStarted","Data":"d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.414000 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerStarted","Data":"a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.421143 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerStarted","Data":"9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97"} Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.458488 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-g24hg" podStartSLOduration=2.458450484 podStartE2EDuration="2.458450484s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.43841316 +0000 UTC m=+1092.076679923" watchObservedRunningTime="2026-03-13 12:06:16.458450484 +0000 UTC m=+1092.096717247" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.518635 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-45j5g" podStartSLOduration=2.5186004889999998 podStartE2EDuration="2.518600489s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.496869331 +0000 UTC m=+1092.135136094" watchObservedRunningTime="2026-03-13 12:06:16.518600489 +0000 UTC m=+1092.156867252" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.544861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.571787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-330b-account-create-update-snkff" podStartSLOduration=2.571770593 podStartE2EDuration="2.571770593s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.567476568 +0000 UTC m=+1092.205743331" watchObservedRunningTime="2026-03-13 12:06:16.571770593 +0000 UTC m=+1092.210037356" Mar 13 12:06:16 crc kubenswrapper[4837]: I0313 12:06:16.620495 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-9a59-account-create-update-hqxzk" podStartSLOduration=2.620078764 podStartE2EDuration="2.620078764s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:16.618011899 +0000 UTC m=+1092.256278662" watchObservedRunningTime="2026-03-13 12:06:16.620078764 +0000 UTC m=+1092.258345527" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.061750 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1335d65b-c0fb-4085-86eb-d948f797ef68" path="/var/lib/kubelet/pods/1335d65b-c0fb-4085-86eb-d948f797ef68/volumes" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.062824 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6776e647-6987-4359-baa9-14ba621118d2" path="/var/lib/kubelet/pods/6776e647-6987-4359-baa9-14ba621118d2/volumes" Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.430784 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerID="286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.430839 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerDied","Data":"286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433234 4837 generic.go:334] "Generic (PLEG): container finished" podID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerID="f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433290 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerDied","Data":"f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.433307 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerStarted","Data":"d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.436473 4837 generic.go:334] "Generic (PLEG): container finished" podID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerID="4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.436541 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerDied","Data":"4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.439830 4837 generic.go:334] "Generic (PLEG): container finished" podID="77cef7b0-af86-456f-973b-923cb901b88d" containerID="42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.439904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerDied","Data":"42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6"} Mar 13 
12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.441936 4837 generic.go:334] "Generic (PLEG): container finished" podID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerID="d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.442092 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerDied","Data":"d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.446469 4837 generic.go:334] "Generic (PLEG): container finished" podID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerID="fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.446562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerDied","Data":"fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f"} Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.468817 4837 generic.go:334] "Generic (PLEG): container finished" podID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerID="88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0" exitCode=0 Mar 13 12:06:17 crc kubenswrapper[4837]: I0313 12:06:17.469296 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:17.860708 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.030358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") pod \"685f13a4-d293-4199-8049-67b02c0162c1\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.030558 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") pod \"685f13a4-d293-4199-8049-67b02c0162c1\" (UID: \"685f13a4-d293-4199-8049-67b02c0162c1\") " Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.031014 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "685f13a4-d293-4199-8049-67b02c0162c1" (UID: "685f13a4-d293-4199-8049-67b02c0162c1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.031373 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/685f13a4-d293-4199-8049-67b02c0162c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.035404 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8" (OuterVolumeSpecName: "kube-api-access-dpbm8") pod "685f13a4-d293-4199-8049-67b02c0162c1" (UID: "685f13a4-d293-4199-8049-67b02c0162c1"). InnerVolumeSpecName "kube-api-access-dpbm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.135016 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbm8\" (UniqueName: \"kubernetes.io/projected/685f13a4-d293-4199-8049-67b02c0162c1-kube-api-access-dpbm8\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.481178 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerStarted","Data":"7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.481805 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.483862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mbps4" event={"ID":"685f13a4-d293-4199-8049-67b02c0162c1","Type":"ContainerDied","Data":"e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467"} Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.483903 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.484043 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mbps4" Mar 13 12:06:18 crc kubenswrapper[4837]: I0313 12:06:18.511576 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podStartSLOduration=4.511560068 podStartE2EDuration="4.511560068s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:18.504754203 +0000 UTC m=+1094.143020986" watchObservedRunningTime="2026-03-13 12:06:18.511560068 +0000 UTC m=+1094.149826831" Mar 13 12:06:18 crc kubenswrapper[4837]: E0313 12:06:18.559350 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod685f13a4_d293_4199_8049_67b02c0162c1.slice/crio-e1a54e4a114da1c297e0edf9ab93b2e5ab7a2495817c3040724f0929f933b467\": RecentStats: unable to find data in memory cache]" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.446704 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.455484 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.506246 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-330b-account-create-update-snkff" event={"ID":"77cef7b0-af86-456f-973b-923cb901b88d","Type":"ContainerDied","Data":"4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515114 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1782dc0e0e5e3ccefab2466f15606ba9a84930c835694be9443f6d78652434" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515092 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-330b-account-create-update-snkff" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.515411 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.518969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a59-account-create-update-hqxzk" event={"ID":"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30","Type":"ContainerDied","Data":"a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.519472 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ee92e3c04affb9424f2cee437c2e7d68ccc6bb7f32033e929beb9ec040971c" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.519594 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a59-account-create-update-hqxzk" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527396 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6b07-account-create-update-wxqsd" event={"ID":"a78456e1-6f14-45d4-ab3f-1fea88af4749","Type":"ContainerDied","Data":"9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527430 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b38c974a02200392bdef3725afd6e7af730d3588ea3167463053055a9dc7c97" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.527483 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6b07-account-create-update-wxqsd" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530627 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-45j5g" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530728 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-45j5g" event={"ID":"e6b37e8b-50ec-402e-ae31-27ff0d84e0be","Type":"ContainerDied","Data":"80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.530757 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80fc33f6bc63815e77d98b7326e47c118cb3973d0734718d9a8997233d51b356" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.531876 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.532603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nbhpw-config-kxsw9" event={"ID":"c1cab316-6ffc-483a-9c64-76be9ac13753","Type":"ContainerDied","Data":"d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.532621 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8e2e7d1269d007fdcb65dd9073b7a70d074c7f6f3d78656f5594e0618ab2446" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.535803 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g24hg" event={"ID":"f029b52a-1a09-44b3-affe-9449cd6a5944","Type":"ContainerDied","Data":"41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89"} Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.535840 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41647603cfd4e0b54c4f06a96b6516d64d08f596f9eedd470d536cab15741d89" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.549895 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613114 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") pod \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") pod \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\" (UID: \"e6b37e8b-50ec-402e-ae31-27ff0d84e0be\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.613827 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6b37e8b-50ec-402e-ae31-27ff0d84e0be" (UID: "e6b37e8b-50ec-402e-ae31-27ff0d84e0be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614140 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") pod \"a78456e1-6f14-45d4-ab3f-1fea88af4749\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614218 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") pod \"77cef7b0-af86-456f-973b-923cb901b88d\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614246 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") pod \"a78456e1-6f14-45d4-ab3f-1fea88af4749\" (UID: \"a78456e1-6f14-45d4-ab3f-1fea88af4749\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614269 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") pod \"77cef7b0-af86-456f-973b-923cb901b88d\" (UID: \"77cef7b0-af86-456f-973b-923cb901b88d\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614537 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") pod \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614620 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") pod \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\" (UID: \"72768daf-a5fa-4c8e-b9c3-49cd5f87fe30\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.615136 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.614620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a78456e1-6f14-45d4-ab3f-1fea88af4749" (UID: "a78456e1-6f14-45d4-ab3f-1fea88af4749"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.615626 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" (UID: "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.616005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77cef7b0-af86-456f-973b-923cb901b88d" (UID: "77cef7b0-af86-456f-973b-923cb901b88d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.618243 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g" (OuterVolumeSpecName: "kube-api-access-l949g") pod "e6b37e8b-50ec-402e-ae31-27ff0d84e0be" (UID: "e6b37e8b-50ec-402e-ae31-27ff0d84e0be"). InnerVolumeSpecName "kube-api-access-l949g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.618275 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9" (OuterVolumeSpecName: "kube-api-access-4sgn9") pod "a78456e1-6f14-45d4-ab3f-1fea88af4749" (UID: "a78456e1-6f14-45d4-ab3f-1fea88af4749"). InnerVolumeSpecName "kube-api-access-4sgn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.619262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl" (OuterVolumeSpecName: "kube-api-access-tgmcl") pod "77cef7b0-af86-456f-973b-923cb901b88d" (UID: "77cef7b0-af86-456f-973b-923cb901b88d"). InnerVolumeSpecName "kube-api-access-tgmcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.619922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs" (OuterVolumeSpecName: "kube-api-access-jvpgs") pod "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" (UID: "72768daf-a5fa-4c8e-b9c3-49cd5f87fe30"). InnerVolumeSpecName "kube-api-access-jvpgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716451 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716547 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") pod \"f029b52a-1a09-44b3-affe-9449cd6a5944\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716700 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716777 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716869 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716932 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") pod \"f029b52a-1a09-44b3-affe-9449cd6a5944\" (UID: \"f029b52a-1a09-44b3-affe-9449cd6a5944\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716958 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run" (OuterVolumeSpecName: "var-run") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.716961 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") pod \"c1cab316-6ffc-483a-9c64-76be9ac13753\" (UID: \"c1cab316-6ffc-483a-9c64-76be9ac13753\") " Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717743 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f029b52a-1a09-44b3-affe-9449cd6a5944" (UID: "f029b52a-1a09-44b3-affe-9449cd6a5944"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717776 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sgn9\" (UniqueName: \"kubernetes.io/projected/a78456e1-6f14-45d4-ab3f-1fea88af4749-kube-api-access-4sgn9\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717803 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cef7b0-af86-456f-973b-923cb901b88d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717816 4837 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717829 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717841 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvpgs\" (UniqueName: \"kubernetes.io/projected/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30-kube-api-access-jvpgs\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717853 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l949g\" (UniqueName: \"kubernetes.io/projected/e6b37e8b-50ec-402e-ae31-27ff0d84e0be-kube-api-access-l949g\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717863 4837 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717922 4837 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1cab316-6ffc-483a-9c64-76be9ac13753-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717937 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78456e1-6f14-45d4-ab3f-1fea88af4749-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.717951 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmcl\" (UniqueName: \"kubernetes.io/projected/77cef7b0-af86-456f-973b-923cb901b88d-kube-api-access-tgmcl\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.718086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts" (OuterVolumeSpecName: "scripts") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.721002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd" (OuterVolumeSpecName: "kube-api-access-n59vd") pod "c1cab316-6ffc-483a-9c64-76be9ac13753" (UID: "c1cab316-6ffc-483a-9c64-76be9ac13753"). InnerVolumeSpecName "kube-api-access-n59vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.721504 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh" (OuterVolumeSpecName: "kube-api-access-cjmfh") pod "f029b52a-1a09-44b3-affe-9449cd6a5944" (UID: "f029b52a-1a09-44b3-affe-9449cd6a5944"). InnerVolumeSpecName "kube-api-access-cjmfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820313 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f029b52a-1a09-44b3-affe-9449cd6a5944-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820351 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820365 4837 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c1cab316-6ffc-483a-9c64-76be9ac13753-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820377 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmfh\" (UniqueName: \"kubernetes.io/projected/f029b52a-1a09-44b3-affe-9449cd6a5944-kube-api-access-cjmfh\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:21 crc kubenswrapper[4837]: I0313 12:06:21.820390 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n59vd\" (UniqueName: \"kubernetes.io/projected/c1cab316-6ffc-483a-9c64-76be9ac13753-kube-api-access-n59vd\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547562 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nbhpw-config-kxsw9" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547562 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerStarted","Data":"4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56"} Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.547628 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-g24hg" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.575742 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2dlt8" podStartSLOduration=3.172906674 podStartE2EDuration="8.575720843s" podCreationTimestamp="2026-03-13 12:06:14 +0000 UTC" firstStartedPulling="2026-03-13 12:06:15.868682042 +0000 UTC m=+1091.506948805" lastFinishedPulling="2026-03-13 12:06:21.271496201 +0000 UTC m=+1096.909762974" observedRunningTime="2026-03-13 12:06:22.569579699 +0000 UTC m=+1098.207846472" watchObservedRunningTime="2026-03-13 12:06:22.575720843 +0000 UTC m=+1098.213987606" Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.643497 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:22 crc kubenswrapper[4837]: I0313 12:06:22.655292 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nbhpw-config-kxsw9"] Mar 13 12:06:23 crc kubenswrapper[4837]: I0313 12:06:23.059433 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" path="/var/lib/kubelet/pods/c1cab316-6ffc-483a-9c64-76be9ac13753/volumes" Mar 13 12:06:23 crc kubenswrapper[4837]: I0313 12:06:23.161772 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:06:24 crc kubenswrapper[4837]: I0313 12:06:24.564971 4837 generic.go:334] "Generic (PLEG): container finished" podID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerID="4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56" exitCode=0 Mar 13 12:06:24 crc kubenswrapper[4837]: I0313 12:06:24.565056 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerDied","Data":"4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56"} Mar 13 12:06:24 crc kubenswrapper[4837]: E0313 12:06:24.866241 4837 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:45154->38.102.83.138:43005: read tcp 38.102.83.138:45154->38.102.83.138:43005: read: connection reset by peer Mar 13 12:06:24 crc kubenswrapper[4837]: E0313 12:06:24.866258 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:45154->38.102.83.138:43005: write tcp 38.102.83.138:45154->38.102.83.138:43005: write: broken pipe Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.153806 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.207211 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.207426 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gqrt7" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" containerID="cri-o://a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" gracePeriod=10 Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.578960 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerStarted","Data":"483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8"} 
Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9"} Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586677 4837 generic.go:334] "Generic (PLEG): container finished" podID="de68f8fe-0650-4ef4-9445-d31e119de423" containerID="a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" exitCode=0 Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqrt7" event={"ID":"de68f8fe-0650-4ef4-9445-d31e119de423","Type":"ContainerDied","Data":"d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d"} Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.586802 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23a17995d98b2790b62117dc60f3874a46893982c985ce77e930333e0f2f46d" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.606449 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jkthw" podStartSLOduration=2.611277292 podStartE2EDuration="34.606428233s" podCreationTimestamp="2026-03-13 12:05:51 +0000 UTC" firstStartedPulling="2026-03-13 12:05:52.490668946 +0000 UTC m=+1068.128935709" lastFinishedPulling="2026-03-13 12:06:24.485819887 +0000 UTC m=+1100.124086650" observedRunningTime="2026-03-13 12:06:25.598694669 +0000 UTC m=+1101.236961432" watchObservedRunningTime="2026-03-13 12:06:25.606428233 +0000 UTC m=+1101.244694996" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.646663 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684418 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684559 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.684597 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") pod \"de68f8fe-0650-4ef4-9445-d31e119de423\" (UID: \"de68f8fe-0650-4ef4-9445-d31e119de423\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.703311 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7" (OuterVolumeSpecName: "kube-api-access-dq8p7") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "kube-api-access-dq8p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.754555 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config" (OuterVolumeSpecName: "config") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.764143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.769688 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.781235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de68f8fe-0650-4ef4-9445-d31e119de423" (UID: "de68f8fe-0650-4ef4-9445-d31e119de423"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791168 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791212 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8p7\" (UniqueName: \"kubernetes.io/projected/de68f8fe-0650-4ef4-9445-d31e119de423-kube-api-access-dq8p7\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791226 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791237 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.791245 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de68f8fe-0650-4ef4-9445-d31e119de423-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.890761 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993201 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.993291 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") pod \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\" (UID: \"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe\") " Mar 13 12:06:25 crc kubenswrapper[4837]: I0313 12:06:25.997923 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt" (OuterVolumeSpecName: "kube-api-access-cqhqt") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "kube-api-access-cqhqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.039939 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.044761 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data" (OuterVolumeSpecName: "config-data") pod "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" (UID: "19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095444 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhqt\" (UniqueName: \"kubernetes.io/projected/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-kube-api-access-cqhqt\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095503 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.095514 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596695 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqrt7" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2dlt8" event={"ID":"19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe","Type":"ContainerDied","Data":"396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad"} Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596735 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2dlt8" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.596750 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396e456946df28247885135847e00125b4072fdad39608b68bcd78f42b00f1ad" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.636277 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.642252 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqrt7"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841111 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841712 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841734 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841753 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="init" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="init" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841772 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841779 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841792 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841802 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841820 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841832 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841840 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841873 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" 
containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841882 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841888 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841906 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841912 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: E0313 12:06:26.841920 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.841927 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842111 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1cab316-6ffc-483a-9c64-76be9ac13753" containerName="ovn-config" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842127 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842139 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842146 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" containerName="dnsmasq-dns" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842158 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" containerName="keystone-db-sync" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842168 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842182 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cef7b0-af86-456f-973b-923cb901b88d" containerName="mariadb-account-create-update" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842190 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.842197 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="685f13a4-d293-4199-8049-67b02c0162c1" containerName="mariadb-database-create" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.843600 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.858329 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.906715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.908275 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.910372 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912328 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912658 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.912873 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.918451 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:26 crc kubenswrapper[4837]: I0313 12:06:26.935444 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017048 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017091 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017111 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " 
pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017168 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017218 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017269 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.017324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.091000 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de68f8fe-0650-4ef4-9445-d31e119de423" path="/var/lib/kubelet/pods/de68f8fe-0650-4ef4-9445-d31e119de423/volumes" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.101059 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.102413 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108743 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.108872 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4srx9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.109040 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119441 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119542 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119561 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119580 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119595 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119629 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.119673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.121993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.121993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.122757 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.125729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.131015 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.135962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.138707 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.158694 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.159271 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"dnsmasq-dns-5959f8865f-x64d8\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.168899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"keystone-bootstrap-kz7j9\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.169378 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.186270 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.222646 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.229987 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.235141 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.249322 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254333 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.254592 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.268405 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.269700 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285131 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285315 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-88ssc" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.285349 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.324874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325317 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325360 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325522 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325600 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325630 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325661 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " 
pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325717 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325830 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325871 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.325936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.326837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.328817 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.329758 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.354712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.371184 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.373550 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.381121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.382860 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktqxm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.391872 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"horizon-6b5f9b5c85-p584g\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442062 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442102 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442210 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442282 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: 
\"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442373 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.442396 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.450923 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.450980 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.451310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.452035 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.458062 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.458369 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.462563 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.463506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.464329 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-klrh4" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.466929 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.467109 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.481430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"ceilometer-0\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.482704 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.493171 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.501048 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.510014 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.511293 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518300 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6tq2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518414 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.518788 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.529083 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547178 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547203 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547221 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod 
\"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547275 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547293 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.547324 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.551687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.566999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.577870 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"neutron-db-sync-wdwg2\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.604301 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.606300 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.638547 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.640579 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649816 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649842 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649858 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649942 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649978 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.649994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650080 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650160 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.650496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.653443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.654038 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.654522 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.659564 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.667692 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.676358 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.679507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.680440 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.683214 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.683383 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"barbican-db-sync-b6qnm\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.689501 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.710988 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"cinder-db-sync-qdzjz\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.751940 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752001 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752039 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752070 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752169 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752215 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " 
pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752264 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752308 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752360 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752433 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.752457 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.753119 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.758465 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.768813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.769002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.771084 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"placement-db-sync-8vx8g\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.790153 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.801064 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.815009 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.861141 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862508 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862542 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862665 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862689 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862728 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862797 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc 
kubenswrapper[4837]: I0313 12:06:27.862837 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.862860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.865841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.865956 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.866869 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.867556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.868289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.868716 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.869074 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.870200 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") 
pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.889184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"dnsmasq-dns-58dd9ff6bc-7kqcz\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.890021 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.892079 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"horizon-7f5fbb4dd7-wcbjd\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:27 crc kubenswrapper[4837]: I0313 12:06:27.906680 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:27 crc kubenswrapper[4837]: W0313 12:06:27.961246 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ff6ef1_7035_4f8e_8ee7_d0b858c92459.slice/crio-ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75 WatchSource:0}: Error finding container ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75: Status 404 returned error can't find the container with id ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75 Mar 13 12:06:28 crc kubenswrapper[4837]: W0313 12:06:28.062440 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dec188c_ab95_4544_ac61_6f435f830f97.slice/crio-a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44 WatchSource:0}: Error finding container a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44: Status 404 returned error can't find the container with id a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44 Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.064151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.165901 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.184108 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.300703 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.431451 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:28 crc kubenswrapper[4837]: W0313 12:06:28.447569 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2afb5c_bfb2_4349_8000_4c0c90892d56.slice/crio-18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c WatchSource:0}: Error finding container 18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c: Status 404 returned error can't find the container with id 18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.652839 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.670403 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerStarted","Data":"40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.670465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerStarted","Data":"a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.674419 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"2fe508c1e7b8efe966205eebb2665129d9e9d777f425ce11141b713c93504dc7"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.676331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerStarted","Data":"18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678392 4837 generic.go:334] "Generic (PLEG): container finished" podID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerID="1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340" exitCode=0 Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678462 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerDied","Data":"1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.678543 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerStarted","Data":"ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75"} Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.717567 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kz7j9" podStartSLOduration=2.71754906 podStartE2EDuration="2.71754906s" podCreationTimestamp="2026-03-13 12:06:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:28.709516936 +0000 UTC m=+1104.347783699" watchObservedRunningTime="2026-03-13 12:06:28.71754906 +0000 UTC m=+1104.355815823" Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.836136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.844359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.852321 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:06:28 crc kubenswrapper[4837]: I0313 12:06:28.983685 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.000060 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.098327 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.099991 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.148461 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.155161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194667 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.194695 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: 
\"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.230257 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.235891 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.295845 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.295936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296084 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296784 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.296841 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.297612 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.300764 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.321376 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"horizon-c6787dc45-zbdfx\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398304 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398686 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398722 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.398889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") pod \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\" (UID: \"64ff6ef1-7035-4f8e-8ee7-d0b858c92459\") " Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.407968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d" (OuterVolumeSpecName: "kube-api-access-8pj7d") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "kube-api-access-8pj7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.425675 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.427229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.435248 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.438599 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.441443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.454919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config" (OuterVolumeSpecName: "config") pod "64ff6ef1-7035-4f8e-8ee7-d0b858c92459" (UID: "64ff6ef1-7035-4f8e-8ee7-d0b858c92459"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501377 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501421 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501435 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pj7d\" (UniqueName: \"kubernetes.io/projected/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-kube-api-access-8pj7d\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501491 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501509 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.501521 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64ff6ef1-7035-4f8e-8ee7-d0b858c92459-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.701173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerStarted","Data":"02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.702758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerStarted","Data":"d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704564 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" event={"ID":"64ff6ef1-7035-4f8e-8ee7-d0b858c92459","Type":"ContainerDied","Data":"ed6fbc437d5476a19d20f53afd28bda92c212f6e9ba02f54fb6312bfa2653c75"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704589 4837 scope.go:117] "RemoveContainer" containerID="1b9a20436328083d120b6ff2c35a76b78305ceb579a5babd22ad7d00d1dc8340" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.704728 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-x64d8" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.708559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerStarted","Data":"d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.712021 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerStarted","Data":"167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.712055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerStarted","Data":"20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.715606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f5fbb4dd7-wcbjd" event={"ID":"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2","Type":"ContainerStarted","Data":"041964cbbb19e51e7b1a85074982a092d132a78534f19eb3616f99ddc8aa3e15"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.720307 4837 generic.go:334] "Generic (PLEG): container finished" podID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerID="a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760" exitCode=0 Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.721937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.721971 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerStarted","Data":"2a0f4fde059e2510bd13af9a796ea4745ff14474c756cbe8a1063e240ce40a71"} Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.726919 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wdwg2" podStartSLOduration=2.726902056 podStartE2EDuration="2.726902056s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:29.726583876 +0000 UTC m=+1105.364850639" watchObservedRunningTime="2026-03-13 12:06:29.726902056 +0000 UTC m=+1105.365168819" Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.803391 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.823492 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-x64d8"] Mar 13 12:06:29 crc kubenswrapper[4837]: I0313 12:06:29.955286 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.763777 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"c8e41db64721802eb9e2d30e33b7feaf3f233822df5127e44d2dee0b5f64ca8a"} Mar 13 
12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.789156 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerStarted","Data":"2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f"} Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.789466 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:30 crc kubenswrapper[4837]: I0313 12:06:30.853562 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" podStartSLOduration=3.853541599 podStartE2EDuration="3.853541599s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:30.846156216 +0000 UTC m=+1106.484422999" watchObservedRunningTime="2026-03-13 12:06:30.853541599 +0000 UTC m=+1106.491808362" Mar 13 12:06:31 crc kubenswrapper[4837]: I0313 12:06:31.077499 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" path="/var/lib/kubelet/pods/64ff6ef1-7035-4f8e-8ee7-d0b858c92459/volumes" Mar 13 12:06:33 crc kubenswrapper[4837]: I0313 12:06:33.821052 4837 generic.go:334] "Generic (PLEG): container finished" podID="3dec188c-ab95-4544-ac61-6f435f830f97" containerID="40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb" exitCode=0 Mar 13 12:06:33 crc kubenswrapper[4837]: I0313 12:06:33.821107 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerDied","Data":"40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb"} Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.771409 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.824651 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:06:35 crc kubenswrapper[4837]: E0313 12:06:35.825151 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.825167 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.825348 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ff6ef1-7035-4f8e-8ee7-d0b858c92459" containerName="init" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.826314 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.833731 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.835210 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851147 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851220 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851284 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851326 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.851352 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.899212 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.927421 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"] Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.929014 
4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964616 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964676 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964724 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.964990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965034 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: 
\"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965076 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965140 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965220 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.965260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.968542 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.973453 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.973810 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.974253 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:35 crc kubenswrapper[4837]: I0313 12:06:35.974990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"] Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.010981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"horizon-5596f9dfb8-m9bxb\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066630 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066718 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066754 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066786 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.066963 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.068095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3df345-07a2-41bf-aae4-088b3ce83b63-logs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.068608 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-scripts\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.069736 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d3df345-07a2-41bf-aae4-088b3ce83b63-config-data\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.074120 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-secret-key\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.075177 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-combined-ca-bundle\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.081287 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d3df345-07a2-41bf-aae4-088b3ce83b63-horizon-tls-certs\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.096660 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfskx\" (UniqueName: \"kubernetes.io/projected/4d3df345-07a2-41bf-aae4-088b3ce83b63-kube-api-access-pfskx\") pod \"horizon-fd6ddfd9b-f66l8\" (UID: \"4d3df345-07a2-41bf-aae4-088b3ce83b63\") " pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.154400 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:06:36 crc kubenswrapper[4837]: I0313 12:06:36.249546 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:06:37 crc kubenswrapper[4837]: I0313 12:06:37.878145 4837 generic.go:334] "Generic (PLEG): container finished" podID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerID="483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8" exitCode=0 Mar 13 12:06:37 crc kubenswrapper[4837]: I0313 12:06:37.878560 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerDied","Data":"483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8"} Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.168794 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.238742 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.239068 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" containerID="cri-o://7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" gracePeriod=10 Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.898457 4837 generic.go:334] "Generic (PLEG): container finished" podID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerID="7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" exitCode=0 Mar 13 12:06:38 crc kubenswrapper[4837]: I0313 12:06:38.898500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2"} Mar 13 12:06:40 crc kubenswrapper[4837]: I0313 12:06:40.153480 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 13 12:06:45 crc kubenswrapper[4837]: I0313 12:06:45.153738 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 13 12:06:48 crc kubenswrapper[4837]: E0313 12:06:48.689500 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 13 12:06:48 crc kubenswrapper[4837]: E0313 12:06:48.690146 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dh5bfh565h684h667h549h5f4h55bh68fh5d8h8ch57h69h65h694h6dh5d7h9dhf6h57fh548h6chb4h549h57ch5f4h5cbhd5h658h548hf5h56cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6r7pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:48 crc kubenswrapper[4837]: I0313 12:06:48.985206 4837 generic.go:334] "Generic (PLEG): container finished" podID="d2d0a770-288f-40d8-832e-f5463863bef1" containerID="167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff" exitCode=0 Mar 13 12:06:48 crc kubenswrapper[4837]: I0313 12:06:48.985256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerDied","Data":"167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff"} Mar 13 12:06:50 crc kubenswrapper[4837]: I0313 12:06:50.154194 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Mar 13 12:06:50 crc kubenswrapper[4837]: I0313 12:06:50.154570 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.295400 4837 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.295984 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n687h5b7h54ch5b7h87h5cfh568hf6h75hc6h646hd8hcbh5f6h686hfh576h567h64fhc9h55h677h645h576h568hfbhbbh5d9h567hf6h54hb7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jr9qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f5fbb4dd7-wcbjd_openstack(99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:50 crc kubenswrapper[4837]: E0313 12:06:50.298699 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7f5fbb4dd7-wcbjd" podUID="99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.711108 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.711624 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vtt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-8vx8g_openstack(08c7b2a5-b0b8-433f-b55d-c64eaeea8b76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.712876 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8vx8g" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.726682 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 12:06:51 crc kubenswrapper[4837]: E0313 12:06:51.726839 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57h598hcfh55ch5dbhf7hbh579h57ch687h88hcfh4h67ch7fhc6h5c5h57bh68ch5b4h5dbhch64bh9fh66h666h546h98h646h5f9h5fh5d5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-scc7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b5f9b5c85-p584g_openstack(1f2afb5c-bfb2-4349-8000-4c0c90892d56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.865784 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.873451 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973358 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973498 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973607 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973672 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973700 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" (UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973732 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") pod \"3dec188c-ab95-4544-ac61-6f435f830f97\" (UID: \"3dec188c-ab95-4544-ac61-6f435f830f97\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.973755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") pod \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\" 
(UID: \"b4490fb3-45d7-4b40-ad34-5bf33ba88491\") " Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.979913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts" (OuterVolumeSpecName: "scripts") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.980830 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.981499 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.984703 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r" (OuterVolumeSpecName: "kube-api-access-5zh4r") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "kube-api-access-5zh4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.986137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:51 crc kubenswrapper[4837]: I0313 12:06:51.995878 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v" (OuterVolumeSpecName: "kube-api-access-gsj8v") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "kube-api-access-gsj8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.003174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data" (OuterVolumeSpecName: "config-data") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.008953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.011389 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dec188c-ab95-4544-ac61-6f435f830f97" (UID: "3dec188c-ab95-4544-ac61-6f435f830f97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkthw" event={"ID":"b4490fb3-45d7-4b40-ad34-5bf33ba88491","Type":"ContainerDied","Data":"17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479"} Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017790 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17d86872aee9655dc63bbe1e8b164cedfec91be43293c0487555e85e1e22c479" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.017846 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkthw" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.021375 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kz7j9" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.023323 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kz7j9" event={"ID":"3dec188c-ab95-4544-ac61-6f435f830f97","Type":"ContainerDied","Data":"a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44"} Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.023365 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81b99f16a86752e1a0e2bc8e53026978a648af00da7f92ffdb871f715779f44" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.023889 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-8vx8g" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.051388 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data" (OuterVolumeSpecName: "config-data") pod "b4490fb3-45d7-4b40-ad34-5bf33ba88491" (UID: "b4490fb3-45d7-4b40-ad34-5bf33ba88491"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076163 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076197 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076209 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076218 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zh4r\" (UniqueName: \"kubernetes.io/projected/b4490fb3-45d7-4b40-ad34-5bf33ba88491-kube-api-access-5zh4r\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076230 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsj8v\" (UniqueName: \"kubernetes.io/projected/3dec188c-ab95-4544-ac61-6f435f830f97-kube-api-access-gsj8v\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076238 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076245 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076252 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dec188c-ab95-4544-ac61-6f435f830f97-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076265 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.076272 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4490fb3-45d7-4b40-ad34-5bf33ba88491-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.786845 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.787367 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4p5k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qdzjz_openstack(a44db1d6-6da2-41a5-a37f-ffc602f0d55a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:52 crc kubenswrapper[4837]: E0313 12:06:52.788563 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qdzjz" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.946319 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:52 crc kubenswrapper[4837]: I0313 12:06:52.953691 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kz7j9"] Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.042401 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qdzjz" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.059850 4837 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" path="/var/lib/kubelet/pods/3dec188c-ab95-4544-ac61-6f435f830f97/volumes"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.060382 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7m97"]
Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.062313 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062342 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap"
Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.062363 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062888 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" containerName="glance-db-sync"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.062917 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dec188c-ab95-4544-ac61-6f435f830f97" containerName="keystone-bootstrap"
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.063507 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m97"]
Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.063595 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073194 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073542 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.073790 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.074020 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.074386 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.198943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.198992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199148 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199161 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.199226 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.266200 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.270810 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.281745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301703 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301726 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.301743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309128 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.309865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"keystone-bootstrap-s7m97\" (UID: 
\"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.311259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.321468 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.337866 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"keystone-bootstrap-s7m97\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.389355 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403335 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403477 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.403609 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.505673 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506144 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506185 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506332 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.506749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507068 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507146 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.507671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.511937 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.530392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"dnsmasq-dns-785d8bcb8c-gcf4g\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.604387 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.624347 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.624502 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sh7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b6qnm_openstack(95b808e7-674f-4592-af6e-f7c8682f6a17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:06:53 crc kubenswrapper[4837]: E0313 12:06:53.625956 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b6qnm" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.716690 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.735599 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.745077 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812469 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812687 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812719 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812786 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.812823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") pod \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\" (UID: \"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813351 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data" (OuterVolumeSpecName: "config-data") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813627 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts" (OuterVolumeSpecName: "scripts") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.813855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs" (OuterVolumeSpecName: "logs") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.817257 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj" (OuterVolumeSpecName: "kube-api-access-jr9qj") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "kube-api-access-jr9qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.820022 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" (UID: "99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.914825 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.914987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915090 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915163 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915227 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915370 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915459 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") pod \"d2d0a770-288f-40d8-832e-f5463863bef1\" (UID: \"d2d0a770-288f-40d8-832e-f5463863bef1\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.915519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trr9k\" (UniqueName: 
\"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") pod \"1a847add-da54-4a5d-9bca-5aea455eefe8\" (UID: \"1a847add-da54-4a5d-9bca-5aea455eefe8\") " Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916195 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916238 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916250 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916261 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr9qj\" (UniqueName: \"kubernetes.io/projected/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-kube-api-access-jr9qj\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.916274 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.920597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd" (OuterVolumeSpecName: "kube-api-access-qq9wd") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "kube-api-access-qq9wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.922018 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k" (OuterVolumeSpecName: "kube-api-access-trr9k") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "kube-api-access-trr9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.939731 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.943120 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config" (OuterVolumeSpecName: "config") pod "d2d0a770-288f-40d8-832e-f5463863bef1" (UID: "d2d0a770-288f-40d8-832e-f5463863bef1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.962049 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.963679 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.967662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.973160 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config" (OuterVolumeSpecName: "config") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:53 crc kubenswrapper[4837]: I0313 12:06:53.977702 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a847add-da54-4a5d-9bca-5aea455eefe8" (UID: "1a847add-da54-4a5d-9bca-5aea455eefe8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018549 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9wd\" (UniqueName: \"kubernetes.io/projected/d2d0a770-288f-40d8-832e-f5463863bef1-kube-api-access-qq9wd\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018590 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018600 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018627 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018659 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018668 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018677 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trr9k\" (UniqueName: \"kubernetes.io/projected/1a847add-da54-4a5d-9bca-5aea455eefe8-kube-api-access-trr9k\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018703 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2d0a770-288f-40d8-832e-f5463863bef1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.018713 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a847add-da54-4a5d-9bca-5aea455eefe8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.105941 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wdwg2" event={"ID":"d2d0a770-288f-40d8-832e-f5463863bef1","Type":"ContainerDied","Data":"20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c"} Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.106176 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20eddadf1412bdac7244116ec35325dbc4b45413968aa761e6fe806d93d5742c" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.106254 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wdwg2" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117477 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" event={"ID":"1a847add-da54-4a5d-9bca-5aea455eefe8","Type":"ContainerDied","Data":"2cbead7100d7df29ad960b80cb3c7ee5eb871cec6fea242940565dd0d3726566"} Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117535 4837 scope.go:117] "RemoveContainer" containerID="7fd2e269ac89746bd02c6eb6e013fcc551156a1538f9f4807e06a63dd46236d2" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.117603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-5nlfg" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.123994 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f5fbb4dd7-wcbjd" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.124328 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f5fbb4dd7-wcbjd" event={"ID":"99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2","Type":"ContainerDied","Data":"041964cbbb19e51e7b1a85074982a092d132a78534f19eb3616f99ddc8aa3e15"} Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.127259 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-b6qnm" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.159195 4837 scope.go:117] "RemoveContainer" containerID="88f5a9c016c890932c1524d02aeb53601bb1a2cc77b41ca9cf3fabeb2713f8a0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.192531 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.193041 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="init" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193067 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="init" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.193095 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193103 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.193124 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193132 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193319 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" containerName="dnsmasq-dns" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.193337 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" containerName="neutron-db-sync" Mar 13 12:06:54 
crc kubenswrapper[4837]: I0313 12:06:54.194410 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.196357 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.197448 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dvhzm" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.197909 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.218266 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323744 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323798 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323822 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323914 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323932 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.323956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod 
\"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.326338 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.335412 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-5nlfg"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.405306 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.406757 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.410556 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427710 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427922 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427945 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.427979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod 
\"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428443 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.428811 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.432091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.432406 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.433388 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.435787 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.481181 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.489328 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530042 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530122 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530412 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530525 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.530599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.561499 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fd6ddfd9b-f66l8"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.608195 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632508 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632529 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.632608 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.633187 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.634142 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc 
kubenswrapper[4837]: I0313 12:06:54.637885 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.638087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.644839 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.646052 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.660976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.695741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.702022 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f5fbb4dd7-wcbjd"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.709933 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: E0313 12:06:54.742681 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-6b5f9b5c85-p584g" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.747151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.753606 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.763524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:06:54 crc kubenswrapper[4837]: W0313 12:06:54.801169 4837 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55b83aaa_c867_44b7_bfea_b43c5bc0e471.slice/crio-b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84 WatchSource:0}: Error finding container b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84: Status 404 returned error can't find the container with id b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84 Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.956913 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:06:54 crc kubenswrapper[4837]: I0313 12:06:54.988148 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.023032 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.027297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.083523 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a847add-da54-4a5d-9bca-5aea455eefe8" path="/var/lib/kubelet/pods/1a847add-da54-4a5d-9bca-5aea455eefe8/volumes" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.088431 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2" path="/var/lib/kubelet/pods/99b33c46-0c03-4ef4-b84a-5ca4ac5c15b2/volumes" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.092396 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.144406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149314 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149489 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.149568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.169897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"04db307b8f39b1892804fc5ba2b134e48dc2f160a0dcc574789fbe9774cfb760"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.170135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"5132c4458264f549762024a8619ce82c80b95d512d63406b0390b55fa902c696"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.174651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.175942 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerStarted","Data":"ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.176153 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b5f9b5c85-p584g" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" containerID="cri-o://ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195246 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6787dc45-zbdfx" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" containerID="cri-o://b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195479 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c6787dc45-zbdfx" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" containerID="cri-o://7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" gracePeriod=30 Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.195573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" 
event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerStarted","Data":"b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.202036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"60985fc2aa747df3481773b902df6591e8f7e0a9aaa937b1d8ccf7c3a2e33f6e"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.203280 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerStarted","Data":"b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.214739 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerStarted","Data":"2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.214784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerStarted","Data":"a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d"} Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.218145 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.219583 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.225658 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.225982 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-88ssc" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.226161 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.226291 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259841 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259870 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" 
Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.259908 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260182 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.260247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264209 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " 
pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264731 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.264952 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.265289 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.268055 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.268712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.283979 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c6787dc45-zbdfx" podStartSLOduration=2.668016912 podStartE2EDuration="26.283964199s" podCreationTimestamp="2026-03-13 12:06:29 +0000 UTC" firstStartedPulling="2026-03-13 12:06:29.991444589 +0000 UTC m=+1105.629711352" lastFinishedPulling="2026-03-13 12:06:53.607391876 +0000 UTC m=+1129.245658639" observedRunningTime="2026-03-13 12:06:55.235507516 +0000 UTC m=+1130.873774269" watchObservedRunningTime="2026-03-13 12:06:55.283964199 +0000 UTC m=+1130.922230952" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.297065 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"dnsmasq-dns-55f844cf75-t9gtj\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.342786 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.345869 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7m97" podStartSLOduration=2.3458500239999998 podStartE2EDuration="2.345850024s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:55.314098616 +0000 UTC m=+1130.952365379" watchObservedRunningTime="2026-03-13 12:06:55.345850024 +0000 UTC m=+1130.984116787" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.364939 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.364979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.365239 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.372625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.373713 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.373747 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.385598 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.408346 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod 
\"neutron-67f9f46cf4-9cvcg\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.443858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.611654 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:55 crc kubenswrapper[4837]: I0313 12:06:55.790145 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.246433 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.304017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.304059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerStarted","Data":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.308018 4837 generic.go:334] "Generic (PLEG): container finished" podID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerID="2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423" exitCode=0 Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.308080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerDied","Data":"2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.313349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"bd322992210798258d691847d1f616bdd9cfed212e86bf6a55ea38cd76ee2987"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.317252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fd6ddfd9b-f66l8" event={"ID":"4d3df345-07a2-41bf-aae4-088b3ce83b63","Type":"ContainerStarted","Data":"ea3e72ba663f65975cf0c5f22cbf3d6fabe3df67dd048e7632a78b0056e1eada"} Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.346547 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5596f9dfb8-m9bxb" podStartSLOduration=21.346525239 podStartE2EDuration="21.346525239s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:56.337145813 +0000 UTC m=+1131.975412576" watchObservedRunningTime="2026-03-13 12:06:56.346525239 +0000 UTC m=+1131.984792002" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.376384 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"c6ddb54236268786175c021b9359fcf4e8bd417495e2748483768742d3e54d9e"} Mar 13 12:06:56 crc kubenswrapper[4837]: 
I0313 12:06:56.387809 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:06:56 crc kubenswrapper[4837]: W0313 12:06:56.405796 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod073acab9_3b9b_432a_aef7_b59bad9fa6ea.slice/crio-f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7 WatchSource:0}: Error finding container f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7: Status 404 returned error can't find the container with id f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7 Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.436697 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fd6ddfd9b-f66l8" podStartSLOduration=21.436677441 podStartE2EDuration="21.436677441s" podCreationTimestamp="2026-03-13 12:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:56.407427722 +0000 UTC m=+1132.045694495" watchObservedRunningTime="2026-03-13 12:06:56.436677441 +0000 UTC m=+1132.074944204" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.784603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935616 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.935779 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.936002 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") pod \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\" (UID: \"55b83aaa-c867-44b7-bfea-b43c5bc0e471\") " Mar 13 12:06:56 crc 
kubenswrapper[4837]: I0313 12:06:56.973513 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw" (OuterVolumeSpecName: "kube-api-access-fkrqw") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "kube-api-access-fkrqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:06:56 crc kubenswrapper[4837]: I0313 12:06:56.978414 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.041195 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkrqw\" (UniqueName: \"kubernetes.io/projected/55b83aaa-c867-44b7-bfea-b43c5bc0e471-kube-api-access-fkrqw\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.173195 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.295226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config" (OuterVolumeSpecName: "config") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.296922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.325249 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354861 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354910 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.354925 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.399869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.414146 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.414565 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55b83aaa-c867-44b7-bfea-b43c5bc0e471" (UID: "55b83aaa-c867-44b7-bfea-b43c5bc0e471"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.442664 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.442654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gcf4g" event={"ID":"55b83aaa-c867-44b7-bfea-b43c5bc0e471","Type":"ContainerDied","Data":"b4d001249d11cd6dc828e98e85e66d69123eb5994ed8271394ef31c5c5efbf84"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.443117 4837 scope.go:117] "RemoveContainer" containerID="2783265a0648c64c9cf8018b7e58d013d90c350f449426212bca3f25289ba423" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454211 4837 generic.go:334] "Generic (PLEG): container finished" podID="e06de12b-6071-4dce-81f1-68539347ca19" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" exitCode=0 Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454288 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.454311 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerStarted","Data":"bd08737ad8dd4994dc887a5676bf16b9103265c49a66d4c535944bbd694008c2"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.456382 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.456408 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55b83aaa-c867-44b7-bfea-b43c5bc0e471-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.468661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.468697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7"} Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.655512 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.698701 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:57 crc kubenswrapper[4837]: I0313 12:06:57.725343 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gcf4g"] Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.500111 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerStarted","Data":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.500762 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.512259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerStarted","Data":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"} Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.513521 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.531839 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" podStartSLOduration=4.531814667 podStartE2EDuration="4.531814667s" podCreationTimestamp="2026-03-13 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.523498686 +0000 UTC m=+1134.161765449" watchObservedRunningTime="2026-03-13 12:06:58.531814667 +0000 UTC m=+1134.170081440" Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533080 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerStarted","Data":"61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83"} Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533166 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log" containerID="cri-o://cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" gracePeriod=30 Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.533257 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd" containerID="cri-o://61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83" gracePeriod=30 Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.577351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce"} Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.580751 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67f9f46cf4-9cvcg" podStartSLOduration=3.5774781620000002 podStartE2EDuration="3.577478162s" podCreationTimestamp="2026-03-13 12:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.546507589 +0000 UTC m=+1134.184774352" watchObservedRunningTime="2026-03-13 12:06:58.577478162 +0000 UTC m=+1134.215744925" Mar 13 12:06:58 crc kubenswrapper[4837]: I0313 12:06:58.590332 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.590308375 podStartE2EDuration="5.590308375s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:58.571753442 +0000 UTC m=+1134.210020205" 
watchObservedRunningTime="2026-03-13 12:06:58.590308375 +0000 UTC m=+1134.228575128" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.062052 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" path="/var/lib/kubelet/pods/55b83aaa-c867-44b7-bfea-b43c5bc0e471/volumes" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.277489 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:06:59 crc kubenswrapper[4837]: E0313 12:06:59.277912 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.277929 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.278090 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="55b83aaa-c867-44b7-bfea-b43c5bc0e471" containerName="init" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.278995 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.281290 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.281515 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.298579 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427951 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.427985 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428198 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428267 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.428307 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.441967 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529804 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529869 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.529948 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.539459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.540981 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.590976 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.592586 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.597797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.600315 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.604308 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"neutron-c5479d889-t9mnp\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668354 4837 generic.go:334] "Generic (PLEG): container finished" podID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerID="61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83" exitCode=143 Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668400 4837 generic.go:334] "Generic (PLEG): container finished" podID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerID="cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" exitCode=143 Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83"} Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.668889 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7"} Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686113 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerStarted","Data":"483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1"} Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686362 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" containerID="cri-o://45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" gracePeriod=30 Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.686539 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" containerID="cri-o://483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" gracePeriod=30 Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.715472 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.715452621 podStartE2EDuration="6.715452621s" podCreationTimestamp="2026-03-13 12:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:06:59.708196093 +0000 UTC m=+1135.346462856" watchObservedRunningTime="2026-03-13 12:06:59.715452621 +0000 UTC m=+1135.353719384" Mar 13 12:06:59 crc kubenswrapper[4837]: I0313 12:06:59.900985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.026658 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167592 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167782 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167848 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167945 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.167987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") pod \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\" (UID: \"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1\") " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.170554 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.171218 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs" (OuterVolumeSpecName: "logs") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.180777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts" (OuterVolumeSpecName: "scripts") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.180848 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj" (OuterVolumeSpecName: "kube-api-access-vkskj") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "kube-api-access-vkskj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.190167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.213745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.228094 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data" (OuterVolumeSpecName: "config-data") pod "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" (UID: "9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269951 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269985 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.269996 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270007 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270035 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270044 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.270058 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkskj\" (UniqueName: \"kubernetes.io/projected/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1-kube-api-access-vkskj\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.289139 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.377738 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.635800 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704054 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1","Type":"ContainerDied","Data":"c6ddb54236268786175c021b9359fcf4e8bd417495e2748483768742d3e54d9e"} Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704063 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.704127 4837 scope.go:117] "RemoveContainer" containerID="61a252e0f8f14e9b64e09ce9249c12af6d208dea329e7deb38c30d99c07d2a83" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.707303 4837 generic.go:334] "Generic (PLEG): container finished" podID="643b18f8-6c85-43ec-977a-c9eade4db120" containerID="483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" exitCode=0 Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.707334 4837 generic.go:334] "Generic (PLEG): container finished" podID="643b18f8-6c85-43ec-977a-c9eade4db120" containerID="45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" exitCode=143 Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.708451 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1"} Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.708488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce"} Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.768473 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.813362 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828059 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:00 crc kubenswrapper[4837]: E0313 12:07:00.828456 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828475 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log" Mar 13 12:07:00 crc kubenswrapper[4837]: E0313 12:07:00.828498 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828506 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828777 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-log" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.828807 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" containerName="glance-httpd" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.829765 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.835938 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.836164 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:07:00 crc kubenswrapper[4837]: I0313 12:07:00.848026 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001727 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001767 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001800 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001840 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001901 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.001958 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.080430 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1" path="/var/lib/kubelet/pods/9dfbb54f-6629-4c5a-8aa9-a17328b8c6f1/volumes" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105170 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105216 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105280 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105353 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.105404 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: 
\"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.106304 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.106913 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.110922 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.116814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.117824 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.118690 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.121245 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.133102 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.151875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 
12:07:01.456357 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.722082 4837 generic.go:334] "Generic (PLEG): container finished" podID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerID="2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c" exitCode=0 Mar 13 12:07:01 crc kubenswrapper[4837]: I0313 12:07:01.722124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerDied","Data":"2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c"} Mar 13 12:07:04 crc kubenswrapper[4837]: I0313 12:07:04.918047 4837 scope.go:117] "RemoveContainer" containerID="cdf4673c99faf73809ec550d38a150bd081436099afc9a2124cbc76da3873ab7" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.136992 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.142358 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.299463 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.304599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.304939 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305190 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305295 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305441 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305569 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.305938 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.306500 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") pod \"3af4ac68-a437-4be7-adab-1ef336f0cbda\" (UID: \"3af4ac68-a437-4be7-adab-1ef336f0cbda\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.306616 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") pod \"643b18f8-6c85-43ec-977a-c9eade4db120\" (UID: \"643b18f8-6c85-43ec-977a-c9eade4db120\") " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.307689 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.308003 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.312283 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs" (OuterVolumeSpecName: "logs") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.321985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b" (OuterVolumeSpecName: "kube-api-access-9zz8b") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "kube-api-access-9zz8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.326416 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb" (OuterVolumeSpecName: "kube-api-access-w7vwb") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "kube-api-access-w7vwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.328926 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts" (OuterVolumeSpecName: "scripts") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.329569 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.338180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts" (OuterVolumeSpecName: "scripts") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.368910 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.410934 4837 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411207 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411276 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411344 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643b18f8-6c85-43ec-977a-c9eade4db120-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411400 4837 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411455 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411522 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411583 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7vwb\" (UniqueName: \"kubernetes.io/projected/3af4ac68-a437-4be7-adab-1ef336f0cbda-kube-api-access-w7vwb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.411664 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zz8b\" (UniqueName: \"kubernetes.io/projected/643b18f8-6c85-43ec-977a-c9eade4db120-kube-api-access-9zz8b\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.445853 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.518132 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.521871 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.522095 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" containerID="cri-o://2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" gracePeriod=10 Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.527065 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.612063 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data" (OuterVolumeSpecName: "config-data") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.612890 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.616899 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3af4ac68-a437-4be7-adab-1ef336f0cbda" (UID: "3af4ac68-a437-4be7-adab-1ef336f0cbda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629191 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629231 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.629241 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3af4ac68-a437-4be7-adab-1ef336f0cbda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.643795 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data" (OuterVolumeSpecName: "config-data") pod "643b18f8-6c85-43ec-977a-c9eade4db120" (UID: "643b18f8-6c85-43ec-977a-c9eade4db120"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.684209 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.731339 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643b18f8-6c85-43ec-977a-c9eade4db120-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.776778 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"643b18f8-6c85-43ec-977a-c9eade4db120","Type":"ContainerDied","Data":"bd322992210798258d691847d1f616bdd9cfed212e86bf6a55ea38cd76ee2987"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.776856 4837 scope.go:117] "RemoveContainer" containerID="483de9baf8ab7c417841c02e737142f145dc9ff207f9fb3cb2947353d68dc2a1" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.777103 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7m97" event={"ID":"3af4ac68-a437-4be7-adab-1ef336f0cbda","Type":"ContainerDied","Data":"a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801576 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67502d2e54c1c7cec9684d1027629e2f2e584a89eba555b6f508bab8dc6003d" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.801654 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s7m97" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.816977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"a0b7b975f0a853ab5afecbe29fde34fc4210243637b54de91d96895e00f81e30"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.841963 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.856483 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.858588 4837 generic.go:334] "Generic (PLEG): container finished" podID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerID="2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" exitCode=0 Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.858688 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.862934 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.862967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"ee2f2c6cd7031c0c388b4947ca3445235863139c835bb92b8b4570fbe2c76095"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.864763 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.877892 4837 scope.go:117] "RemoveContainer" containerID="45488696f1bb86187ce327f2ad0012c25e169de59468b4dda6460464fc5497ce" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887081 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887469 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887498 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887527 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.887533 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: E0313 12:07:05.887552 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc 
kubenswrapper[4837]: I0313 12:07:05.887558 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888275 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-httpd" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888312 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" containerName="glance-log" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.888335 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" containerName="keystone-bootstrap" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.889486 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.899120 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.903144 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerStarted","Data":"117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355"} Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.903622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.917518 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:05 crc kubenswrapper[4837]: I0313 12:07:05.993569 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8vx8g" podStartSLOduration=2.794016875 podStartE2EDuration="38.993544499s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.886926421 +0000 UTC m=+1104.525193174" lastFinishedPulling="2026-03-13 12:07:05.086454035 +0000 UTC m=+1140.724720798" observedRunningTime="2026-03-13 12:07:05.991804974 +0000 UTC m=+1141.630071737" watchObservedRunningTime="2026-03-13 12:07:05.993544499 +0000 UTC m=+1141.631811252" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061825 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061929 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061967 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.061989 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062026 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062056 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062104 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.062192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.157445 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.157492 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169731 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169792 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169853 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169892 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169929 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169951 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.169996 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.171823 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.172072 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.174883 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.176094 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 
12:07:06.184536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.190809 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.201798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.250009 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.250040 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.252841 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fd6ddfd9b-f66l8" podUID="4d3df345-07a2-41bf-aae4-088b3ce83b63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.265191 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.270688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.315534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.387694 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.389076 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399435 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399769 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w6mdg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.399939 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400315 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.400364 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.411102 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.444042 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477581 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477693 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477726 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477770 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.477842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.485933 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579882 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579915 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.579978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580087 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") pod \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\" (UID: \"306aa5e9-7f77-4ff8-9cf6-5b3255c85337\") " Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580444 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580682 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580726 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580747 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.580783 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.588091 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-config-data\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.603458 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-credential-keys\") pod 
\"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.605300 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-fernet-keys\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.606104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d" (OuterVolumeSpecName: "kube-api-access-nmp8d") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "kube-api-access-nmp8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.612665 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-internal-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.615310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-scripts\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.615917 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-combined-ca-bundle\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.620086 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb9614d-a433-4be3-8145-4c1c8593404f-public-tls-certs\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.644157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tn5x\" (UniqueName: \"kubernetes.io/projected/9cb9614d-a433-4be3-8145-4c1c8593404f-kube-api-access-8tn5x\") pod \"keystone-55dc4d44f8-mvjvg\" (UID: \"9cb9614d-a433-4be3-8145-4c1c8593404f\") " pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.685870 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmp8d\" (UniqueName: \"kubernetes.io/projected/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-kube-api-access-nmp8d\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.686623 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.733653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.743307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.754086 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.757342 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.762497 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config" (OuterVolumeSpecName: "config") pod "306aa5e9-7f77-4ff8-9cf6-5b3255c85337" (UID: "306aa5e9-7f77-4ff8-9cf6-5b3255c85337"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787171 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787352 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787653 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787746 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.787908 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/306aa5e9-7f77-4ff8-9cf6-5b3255c85337-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:06 crc kubenswrapper[4837]: I0313 12:07:06.926936 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.054025 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.122555 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="643b18f8-6c85-43ec-977a-c9eade4db120" path="/var/lib/kubelet/pods/643b18f8-6c85-43ec-977a-c9eade4db120/volumes" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132141 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132175 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-7kqcz" event={"ID":"306aa5e9-7f77-4ff8-9cf6-5b3255c85337","Type":"ContainerDied","Data":"2a0f4fde059e2510bd13af9a796ea4745ff14474c756cbe8a1063e240ce40a71"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132207 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerStarted","Data":"541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.132219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerStarted","Data":"ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f"} Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.134489 4837 scope.go:117] "RemoveContainer" containerID="2f11bc74222520fcf554ba948fc1d1529fb608acebf234b92a60442a96bc720f" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.156031 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.168304 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-7kqcz"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.205209 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.226563 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b6qnm" podStartSLOduration=3.20119477 podStartE2EDuration="40.226540153s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.873127308 +0000 UTC m=+1104.511394061" lastFinishedPulling="2026-03-13 12:07:05.898472681 +0000 UTC m=+1141.536739444" observedRunningTime="2026-03-13 12:07:07.187429304 +0000 UTC m=+1142.825696057" watchObservedRunningTime="2026-03-13 12:07:07.226540153 +0000 UTC m=+1142.864806916" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.233968 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c5479d889-t9mnp" podStartSLOduration=8.233951736 podStartE2EDuration="8.233951736s" podCreationTimestamp="2026-03-13 12:06:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:07.219380798 +0000 UTC m=+1142.857647561" watchObservedRunningTime="2026-03-13 12:07:07.233951736 +0000 UTC m=+1142.872218499" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.270374 4837 scope.go:117] "RemoveContainer" containerID="a952d72f45aa2f65e1c6c2e7322bdcf16fb7324473b881a19caa21b16b66a760" Mar 13 12:07:07 crc kubenswrapper[4837]: I0313 12:07:07.473395 4837 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-55dc4d44f8-mvjvg"] Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.194906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.195424 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"9af1f1b6bae1b057a7c5b2be284aed718dd1bd53fd4267a097ec24a461a2d852"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.200781 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerStarted","Data":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.222742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55dc4d44f8-mvjvg" event={"ID":"9cb9614d-a433-4be3-8145-4c1c8593404f","Type":"ContainerStarted","Data":"3294b982819f14d3b39958636e95d0c9c0debe4154e94196c3a7a5c537db54dd"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.223156 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.223169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55dc4d44f8-mvjvg" event={"ID":"9cb9614d-a433-4be3-8145-4c1c8593404f","Type":"ContainerStarted","Data":"74e03940743ec716715181a666394a63c07578634a654ed8b412096592d067ad"} Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.234657 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.2346299 podStartE2EDuration="8.2346299s" podCreationTimestamp="2026-03-13 12:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:08.226469464 +0000 UTC m=+1143.864736227" watchObservedRunningTime="2026-03-13 12:07:08.2346299 +0000 UTC m=+1143.872896673" Mar 13 12:07:08 crc kubenswrapper[4837]: I0313 12:07:08.279210 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55dc4d44f8-mvjvg" podStartSLOduration=2.27919109 podStartE2EDuration="2.27919109s" podCreationTimestamp="2026-03-13 12:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:08.263145957 +0000 UTC m=+1143.901412720" watchObservedRunningTime="2026-03-13 12:07:08.27919109 +0000 UTC m=+1143.917457853" Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.060322 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" path="/var/lib/kubelet/pods/306aa5e9-7f77-4ff8-9cf6-5b3255c85337/volumes" Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.236266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerStarted","Data":"a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f"} Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 
12:07:09.238427 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerStarted","Data":"843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2"} Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.264797 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.264772961 podStartE2EDuration="4.264772961s" podCreationTimestamp="2026-03-13 12:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:09.258883495 +0000 UTC m=+1144.897150268" watchObservedRunningTime="2026-03-13 12:07:09.264772961 +0000 UTC m=+1144.903039724" Mar 13 12:07:09 crc kubenswrapper[4837]: I0313 12:07:09.290715 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qdzjz" podStartSLOduration=4.441583107 podStartE2EDuration="42.290692005s" podCreationTimestamp="2026-03-13 12:06:27 +0000 UTC" firstStartedPulling="2026-03-13 12:06:28.890160553 +0000 UTC m=+1104.528427316" lastFinishedPulling="2026-03-13 12:07:06.739269451 +0000 UTC m=+1142.377536214" observedRunningTime="2026-03-13 12:07:09.28161481 +0000 UTC m=+1144.919881573" watchObservedRunningTime="2026-03-13 12:07:09.290692005 +0000 UTC m=+1144.928958768" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.457203 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.457268 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.499987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:07:11 crc kubenswrapper[4837]: I0313 12:07:11.515520 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.276924 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerID="117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355" exitCode=0 Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277017 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerDied","Data":"117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355"} Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277411 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:07:12 crc kubenswrapper[4837]: I0313 12:07:12.277445 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:07:14 crc kubenswrapper[4837]: I0313 12:07:14.958992 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.320789 4837 generic.go:334] "Generic (PLEG): container finished" podID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerID="ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f" exitCode=0 Mar 13 
12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.320844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerDied","Data":"ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f"} Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.507177 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.648166 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.732528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.732684 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.733411 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") pod \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\" (UID: \"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76\") " Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.735597 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs" (OuterVolumeSpecName: "logs") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.749079 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts" (OuterVolumeSpecName: "scripts") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.756734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2" (OuterVolumeSpecName: "kube-api-access-2vtt2") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "kube-api-access-2vtt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.783341 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840281 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtt2\" (UniqueName: \"kubernetes.io/projected/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-kube-api-access-2vtt2\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840315 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840326 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.840334 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.848960 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data" (OuterVolumeSpecName: "config-data") pod "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" (UID: "08c7b2a5-b0b8-433f-b55d-c64eaeea8b76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:15 crc kubenswrapper[4837]: I0313 12:07:15.941881 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.155712 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.251176 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fd6ddfd9b-f66l8" podUID="4d3df345-07a2-41bf-aae4-088b3ce83b63" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.346541 4837 generic.go:334] "Generic (PLEG): container finished" podID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerID="843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2" exitCode=0 Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.346690 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerDied","Data":"843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2"} Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.350436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8vx8g" event={"ID":"08c7b2a5-b0b8-433f-b55d-c64eaeea8b76","Type":"ContainerDied","Data":"d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1"} Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.350480 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8vx8g" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.350490 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.487006 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.487059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.552027 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.580818 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.784742 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794378 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794777 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794794 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794817 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="init" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794823 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="init" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794837 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794844 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: E0313 12:07:16.794864 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.794869 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795045 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" containerName="placement-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795067 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" containerName="barbican-db-sync" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795079 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="306aa5e9-7f77-4ff8-9cf6-5b3255c85337" containerName="dnsmasq-dns" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.795986 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798206 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798512 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.798622 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.799309 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.799563 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6tq2" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.811981 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.858453 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.858581 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.858734 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") pod \"95b808e7-674f-4592-af6e-f7c8682f6a17\" (UID: \"95b808e7-674f-4592-af6e-f7c8682f6a17\") " Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859249 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859342 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859409 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859598 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.859629 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.880301 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.902586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t" (OuterVolumeSpecName: "kube-api-access-9sh7t") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "kube-api-access-9sh7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.931908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95b808e7-674f-4592-af6e-f7c8682f6a17" (UID: "95b808e7-674f-4592-af6e-f7c8682f6a17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961431 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961751 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.961970 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962050 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962074 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sh7t\" (UniqueName: \"kubernetes.io/projected/95b808e7-674f-4592-af6e-f7c8682f6a17-kube-api-access-9sh7t\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.962087 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95b808e7-674f-4592-af6e-f7c8682f6a17-db-sync-config-data\") on node \"crc\" DevicePath 
\"\"" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.965884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07eece9e-0e59-4a06-8fea-efb4217d6907-logs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.968622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-combined-ca-bundle\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.970111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-scripts\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.970261 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-internal-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.978260 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-public-tls-certs\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.983606 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87qw\" (UniqueName: \"kubernetes.io/projected/07eece9e-0e59-4a06-8fea-efb4217d6907-kube-api-access-w87qw\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:16 crc kubenswrapper[4837]: I0313 12:07:16.986367 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07eece9e-0e59-4a06-8fea-efb4217d6907-config-data\") pod \"placement-59f7b5dc8d-rnsz6\" (UID: \"07eece9e-0e59-4a06-8fea-efb4217d6907\") " pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.114860 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:17 crc kubenswrapper[4837]: E0313 12:07:17.173061 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b6qnm" event={"ID":"95b808e7-674f-4592-af6e-f7c8682f6a17","Type":"ContainerDied","Data":"02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274"} Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412147 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.412208 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b6qnm" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerStarted","Data":"efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237"} Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420745 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.420868 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" containerID="cri-o://9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421089 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" containerID="cri-o://efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421126 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" containerID="cri-o://367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" gracePeriod=30 Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.421490 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.723273 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.731904 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.744508 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ktqxm" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.745065 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.745863 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.754322 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.818446 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.820245 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.836267 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894626 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894670 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894771 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894789 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894807 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894849 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.894883 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.954314 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f7b5dc8d-rnsz6"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.979714 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:17 crc kubenswrapper[4837]: I0313 12:07:17.981305 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006358 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006530 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006657 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006768 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006848 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.006877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.007733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55084c82-a823-4f31-926e-21702ba02ba1-logs\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.010948 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-logs\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data-custom\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.016756 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.018066 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-combined-ca-bundle\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.029733 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-config-data\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.030499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-combined-ca-bundle\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " 
pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.044459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55084c82-a823-4f31-926e-21702ba02ba1-config-data-custom\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.058238 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ktjq\" (UniqueName: \"kubernetes.io/projected/55084c82-a823-4f31-926e-21702ba02ba1-kube-api-access-4ktjq\") pod \"barbican-worker-6f4ff9ff9-mjmsz\" (UID: \"55084c82-a823-4f31-926e-21702ba02ba1\") " pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.075575 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.078527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6xz\" (UniqueName: \"kubernetes.io/projected/d1cfe08e-23bd-4f52-ab3c-3d68377de2a9-kube-api-access-vp6xz\") pod \"barbican-keystone-listener-58c489697d-dgjtz\" (UID: \"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9\") " pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.102156 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.116878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.116963 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117035 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117124 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod 
\"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.117211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.180619 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.206184 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.206292 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.212117 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.218836 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.218924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219055 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219206 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.219331 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc 
kubenswrapper[4837]: I0313 12:07:18.220175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.220672 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.226517 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.227197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.227512 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.242953 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"dnsmasq-dns-85ff748b95-nn655\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.317138 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323245 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323313 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.323395 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.349288 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.384636 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.426431 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427208 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427242 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427273 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427404 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") pod \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\" (UID: \"a44db1d6-6da2-41a5-a37f-ffc602f0d55a\") " Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427622 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427733 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427758 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427832 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.427874 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433184 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.433555 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.434885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.439577 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.450883 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2" (OuterVolumeSpecName: "kube-api-access-4p5k2") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "kube-api-access-4p5k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.458064 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.473863 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts" (OuterVolumeSpecName: "scripts") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.480625 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"barbican-api-7598d89cd4-qfmh9\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qdzjz" event={"ID":"a44db1d6-6da2-41a5-a37f-ffc602f0d55a","Type":"ContainerDied","Data":"d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528884 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.528988 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qdzjz" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529455 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529482 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529491 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5k2\" (UniqueName: \"kubernetes.io/projected/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-kube-api-access-4p5k2\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.529500 4837 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.537841 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.539982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"790e0c04a60b645164e78de86a6f7c1dd04d1f00a81d1f05a478395fa2197f78"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.565319 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.593623 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" exitCode=0 Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.593671 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" exitCode=2 Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.594164 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.594231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167"} Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.632101 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.663724 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: E0313 12:07:18.664125 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.664138 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.664356 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" containerName="cinder-db-sync" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.665290 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.668423 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.670865 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data" (OuterVolumeSpecName: "config-data") pod "a44db1d6-6da2-41a5-a37f-ffc602f0d55a" (UID: "a44db1d6-6da2-41a5-a37f-ffc602f0d55a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.710208 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.733783 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734206 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734248 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734385 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734441 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.734622 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44db1d6-6da2-41a5-a37f-ffc602f0d55a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.767408 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f4ff9ff9-mjmsz"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835785 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwghc\" (UniqueName: 
\"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835814 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835878 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835910 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.835936 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.836780 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.850291 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.850902 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.851916 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.853797 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.890370 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"cinder-scheduler-0\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.897392 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.938574 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.958280 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.973183 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.978202 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.990970 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.992756 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:18 crc kubenswrapper[4837]: I0313 12:07:18.997121 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040155 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040212 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040238 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040274 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" 
Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040422 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040473 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040505 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040528 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040551 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040585 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.040605 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.095523 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141877 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.141992 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142011 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142029 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142114 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142143 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142172 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.142196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143049 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143582 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143600 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.143696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.144099 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.144721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.145779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 
crc kubenswrapper[4837]: I0313 12:07:19.162416 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.162673 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.164286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.166534 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.169139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"dnsmasq-dns-5c9776ccc5-txzkw\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.183294 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"cinder-api-0\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.207888 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.213387 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58c489697d-dgjtz"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.355096 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.403327 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.569689 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610381 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"6a04090f3a67fe0e7b4a52cc36393d294f41e3eefc5aa787e9d8b0ac7104fabd"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610422 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f7b5dc8d-rnsz6" event={"ID":"07eece9e-0e59-4a06-8fea-efb4217d6907","Type":"ContainerStarted","Data":"08f34771db4517922084f9af36c9fc7b53eda6e562f0e4d0c248471961304247"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610899 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.610918 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.614440 4837 generic.go:334] "Generic (PLEG): container finished" podID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerID="1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc" exitCode=0 Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.614831 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerDied","Data":"1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.614866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerStarted","Data":"0c767985dc8b487e12a79302a8fc49009a12a9085c34846f811ba339c130fda9"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.620102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"c2b48c97eeb59bbfb8ae4e79f9743fb8b22a8899c906ed3b96615efa53b32d03"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"22b99983d4298366daa1ac0a7327f79a61b6b54e0d37feba4a924c3afab8ca2c"} Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622927 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.622949 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.665318 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59f7b5dc8d-rnsz6" podStartSLOduration=3.665297548 podStartE2EDuration="3.665297548s" podCreationTimestamp="2026-03-13 12:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:19.644375241 +0000 UTC m=+1155.282642004" watchObservedRunningTime="2026-03-13 12:07:19.665297548 +0000 
UTC m=+1155.303564311" Mar 13 12:07:19 crc kubenswrapper[4837]: I0313 12:07:19.754895 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.115905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.269199 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.361529 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.517976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518048 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518175 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518194 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.518305 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") pod \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\" (UID: \"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc\") " Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.564059 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.567183 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.571238 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.572800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz" (OuterVolumeSpecName: "kube-api-access-9hhlz") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "kube-api-access-9hhlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.596503 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config" (OuterVolumeSpecName: "config") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.611709 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" (UID: "8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622906 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622947 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622961 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhlz\" (UniqueName: \"kubernetes.io/projected/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-kube-api-access-9hhlz\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622976 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622987 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.622999 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.666166 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.666209 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.666220 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerStarted","Data":"e401c09fc39f0377fdb0e13cc3564c85b21b640ae75df7edda1290f89d0c1fda"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.667180 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.667238 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.671850 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"2608f88642291363e7163567a42948a0027f5da1a879663defaa6a0c943729b9"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.675802 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"036fe40da00c951a03639436f1d70f870b431fe6b0431a148132bcc8ff154aeb"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nn655" event={"ID":"8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc","Type":"ContainerDied","Data":"0c767985dc8b487e12a79302a8fc49009a12a9085c34846f811ba339c130fda9"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694083 4837 scope.go:117] "RemoveContainer" containerID="1ba1ecdd77454b316a59d76a9628041bd162e97b7a353cd219789d008b6bfecc" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.694239 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nn655" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.700007 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7598d89cd4-qfmh9" podStartSLOduration=2.699983241 podStartE2EDuration="2.699983241s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:20.68469284 +0000 UTC m=+1156.322959623" watchObservedRunningTime="2026-03-13 12:07:20.699983241 +0000 UTC m=+1156.338250004" Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.732895 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.732937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"3b4bbdde4e1a36119cc27a40f2a694902d8b5f53fa6c902b59c1385e734f5a5e"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.770374 4837 generic.go:334] "Generic (PLEG): container finished" podID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerID="9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" exitCode=0 Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.773740 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6"} Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.821065 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:20 crc kubenswrapper[4837]: I0313 12:07:20.835415 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nn655"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.080964 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.095828 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" path="/var/lib/kubelet/pods/8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc/volumes" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241480 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241628 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241704 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.241912 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") pod \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\" (UID: \"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee\") " Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.243193 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.243893 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.251816 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn" (OuterVolumeSpecName: "kube-api-access-6r7pn") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "kube-api-access-6r7pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.257822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts" (OuterVolumeSpecName: "scripts") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.287953 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.334897 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.344966 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7pn\" (UniqueName: \"kubernetes.io/projected/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-kube-api-access-6r7pn\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345044 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345068 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345078 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345089 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.345096 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.363187 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data" (OuterVolumeSpecName: "config-data") pod "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" (UID: "6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.449248 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.462843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.464094 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.465825 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791655 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791662 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee","Type":"ContainerDied","Data":"2fe508c1e7b8efe966205eebb2665129d9e9d777f425ce11141b713c93504dc7"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.791809 4837 scope.go:117] "RemoveContainer" containerID="efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.798686 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.807072 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerID="18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d" exitCode=0 Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.808292 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d"} Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.931323 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.959759 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.968925 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.969916 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.969938 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.969983 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.969992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.970006 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970014 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: E0313 12:07:21.970032 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970040 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970254 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="ceilometer-notification-agent" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970277 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="sg-core" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970295 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" containerName="proxy-httpd" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.970314 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8105a7ee-7d4e-471a-a39d-b3b9b75c3dcc" containerName="init" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.972840 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.977215 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.978867 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:21 crc kubenswrapper[4837]: I0313 12:07:21.979994 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.064650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.065540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066084 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066258 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " 
pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.066552 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168305 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168487 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168566 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168618 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.168659 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.169104 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.171224 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 
12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.176388 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.176844 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.179650 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.180683 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.194166 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"ceilometer-0\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.337980 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.393904 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.596510 4837 scope.go:117] "RemoveContainer" containerID="367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167" Mar 13 12:07:22 crc kubenswrapper[4837]: I0313 12:07:22.757706 4837 scope.go:117] "RemoveContainer" containerID="9c2af154abb9a37a270c00c3cc335b4994ab6bb24ddaf80f1f5bfc313a6b9fb6" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.073075 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee" path="/var/lib/kubelet/pods/6e1cc75f-7386-440f-ba9f-9c3fd7b7d4ee/volumes" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.381988 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.845119 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"affa40a245268506c6f6766fb2f158d46986fd7f106dd4cfb003b265c6f1faa4"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.850059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.852747 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"44b1cd8a58b692c49a76e5f57bb432f41dc1c5c54cebee1805c8fa67be55ed5c"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.852798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" event={"ID":"55084c82-a823-4f31-926e-21702ba02ba1","Type":"ContainerStarted","Data":"c54851290685ce709d0e6de4969cfb9edf3402a8ad802731045343b1d7b59d2a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.855176 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"149e91272542ec915648f0494b9e7a35f69d4dd526e3c7ef28a873474b5326e9"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.855213 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" event={"ID":"d1cfe08e-23bd-4f52-ab3c-3d68377de2a9","Type":"ContainerStarted","Data":"2da0bfaeffcd62165fc74c1d17cbbde1617d888708f62aa9c7c0915aba58a23a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.858768 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerStarted","Data":"308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.858884 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.878212 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f4ff9ff9-mjmsz" 
podStartSLOduration=3.049866951 podStartE2EDuration="6.87819637s" podCreationTimestamp="2026-03-13 12:07:17 +0000 UTC" firstStartedPulling="2026-03-13 12:07:18.97576274 +0000 UTC m=+1154.614029503" lastFinishedPulling="2026-03-13 12:07:22.804092159 +0000 UTC m=+1158.442358922" observedRunningTime="2026-03-13 12:07:23.877081696 +0000 UTC m=+1159.515348469" watchObservedRunningTime="2026-03-13 12:07:23.87819637 +0000 UTC m=+1159.516463133" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.883892 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerStarted","Data":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884063 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" containerID="cri-o://bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" gracePeriod=30 Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884207 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.884261 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" containerID="cri-o://4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" gracePeriod=30 Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.906660 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58c489697d-dgjtz" podStartSLOduration=3.382571377 podStartE2EDuration="6.906620944s" podCreationTimestamp="2026-03-13 12:07:17 +0000 UTC" firstStartedPulling="2026-03-13 12:07:19.234349516 +0000 UTC m=+1154.872616279" lastFinishedPulling="2026-03-13 12:07:22.758399083 +0000 UTC m=+1158.396665846" observedRunningTime="2026-03-13 12:07:23.904010802 +0000 UTC m=+1159.542277565" watchObservedRunningTime="2026-03-13 12:07:23.906620944 +0000 UTC m=+1159.544887717" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.944194 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" podStartSLOduration=5.944175244 podStartE2EDuration="5.944175244s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:23.943360658 +0000 UTC m=+1159.581627411" watchObservedRunningTime="2026-03-13 12:07:23.944175244 +0000 UTC m=+1159.582442007" Mar 13 12:07:23 crc kubenswrapper[4837]: I0313 12:07:23.988161 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.9881429950000005 podStartE2EDuration="5.988142995s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:23.975064985 +0000 UTC m=+1159.613331748" watchObservedRunningTime="2026-03-13 12:07:23.988142995 +0000 UTC m=+1159.626409758" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.713632 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:24 crc 
kubenswrapper[4837]: I0313 12:07:24.716224 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.721537 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.721854 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.754918 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872569 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872617 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872819 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.872990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.873053 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.873213 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 
12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.904920 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerStarted","Data":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.909186 4837 generic.go:334] "Generic (PLEG): container finished" podID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" exitCode=143 Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.909264 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.913610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552"} Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.926726 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.060688637 podStartE2EDuration="6.926708648s" podCreationTimestamp="2026-03-13 12:07:18 +0000 UTC" firstStartedPulling="2026-03-13 12:07:19.824601443 +0000 UTC m=+1155.462868206" lastFinishedPulling="2026-03-13 12:07:22.690621454 +0000 UTC m=+1158.328888217" observedRunningTime="2026-03-13 12:07:24.920343387 +0000 UTC m=+1160.558610150" watchObservedRunningTime="2026-03-13 12:07:24.926708648 +0000 UTC m=+1160.564975411" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975246 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975460 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975479 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d459r\" (UniqueName: 
\"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975526 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.975579 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.976732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c7e377-b579-47bc-a992-cca0cf047627-logs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.982197 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-public-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.985900 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.987964 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-internal-tls-certs\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.997662 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-config-data-custom\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.998378 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c7e377-b579-47bc-a992-cca0cf047627-combined-ca-bundle\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:24 crc kubenswrapper[4837]: I0313 12:07:24.999313 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d459r\" (UniqueName: 
\"kubernetes.io/projected/74c7e377-b579-47bc-a992-cca0cf047627-kube-api-access-d459r\") pod \"barbican-api-6d84f6b8c8-8rrwq\" (UID: \"74c7e377-b579-47bc-a992-cca0cf047627\") " pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.063156 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.318898 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167.scope WatchSource:0}: Error finding container 367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167: Status 404 returned error can't find the container with id 367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167 Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.329127 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-conmon-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-conmon-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.329182 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.353942 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.353988 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-efca9fe013107e0157b7dfeec701a6bf70c8455183d2d8d806b63a4c79489237.scope: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: W0313 12:07:25.386466 4837 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8105a7ee_7d4e_471a_a39d_b3b9b75c3dcc.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8105a7ee_7d4e_471a_a39d_b3b9b75c3dcc.slice: no such file or directory Mar 13 12:07:25 crc kubenswrapper[4837]: E0313 12:07:25.647653 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-conmon-117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-conmon-ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice/crio-d87408c4f80f070da48980a1c0c42ec26d6e0f566d37471876ae97d32157796e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c7b2a5_b0b8_433f_b55d_c64eaeea8b76.slice/crio-d8d4fa30fd1f227e47a679c4ebd48ddee761f9902a8c45ed343c205dc3f7e3b1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b808e7_674f_4592_af6e_f7c8682f6a17.slice/crio-02cdc5326e2dbc385d4e7090105a3655b6651929ef4db12950f0c379aaf98274\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e1cc75f_7386_440f_ba9f_9c3fd7b7d4ee.slice/crio-conmon-367b20646b35759313f66e2deebf0c3f1def518ed4ba18cc4ba66cc774436167.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44db1d6_6da2_41a5_a37f_ffc602f0d55a.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.657429 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.684736 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84f6b8c8-8rrwq"] Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.885341 4837 scope.go:117] "RemoveContainer" containerID="1a04d5901dd1375cafd0fc584ce462f13000b8c9b02a1c2603aedb866420cd51" Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.971421 4837 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.971716 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" containerID="cri-o://9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" gracePeriod=30 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.972470 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" containerID="cri-o://541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" gracePeriod=30 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992018 4837 generic.go:334] "Generic (PLEG): container finished" podID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerID="7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" exitCode=137 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992051 4837 generic.go:334] "Generic (PLEG): container finished" podID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerID="b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" exitCode=137 Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca"} Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.992170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206"} Mar 13 12:07:25 crc kubenswrapper[4837]: I0313 12:07:25.997098 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:25.999189 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.005867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"cf495db3422cf0d35cd836e716531a2384380d9bbe3980dd393e4bcfb3fe6343"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.036749 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039766 4837 generic.go:334] "Generic (PLEG): container finished" podID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerID="ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" exitCode=137 Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerDied","Data":"ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039918 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b5f9b5c85-p584g" event={"ID":"1f2afb5c-bfb2-4349-8000-4c0c90892d56","Type":"ContainerDied","Data":"18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.039932 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18955fe5d50dd684cdf6370fe66929e8470b6ef70461302b53ed56fa3595256c" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.074513 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.081003 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006"} Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.099042 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.109094 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-c5479d889-t9mnp" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": read tcp 10.217.0.2:41566->10.217.0.158:9696: read: connection reset by peer" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131717 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131794 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.131853 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132007 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132041 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") pod \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\" (UID: \"1f2afb5c-bfb2-4349-8000-4c0c90892d56\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132446 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132514 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132595 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132619 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.132662 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.146274 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs" (OuterVolumeSpecName: "logs") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.148888 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s" (OuterVolumeSpecName: "kube-api-access-scc7s") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "kube-api-access-scc7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.153895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.212113 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data" (OuterVolumeSpecName: "config-data") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.231787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts" (OuterVolumeSpecName: "scripts") pod "1f2afb5c-bfb2-4349-8000-4c0c90892d56" (UID: "1f2afb5c-bfb2-4349-8000-4c0c90892d56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233811 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.233993 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234100 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234210 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") pod \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\" (UID: \"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14\") " Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234451 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234582 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234764 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234860 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234934 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234955 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.234991 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235146 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scc7s\" (UniqueName: \"kubernetes.io/projected/1f2afb5c-bfb2-4349-8000-4c0c90892d56-kube-api-access-scc7s\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235161 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235173 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2afb5c-bfb2-4349-8000-4c0c90892d56-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235184 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f2afb5c-bfb2-4349-8000-4c0c90892d56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.235195 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f2afb5c-bfb2-4349-8000-4c0c90892d56-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.236569 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs" (OuterVolumeSpecName: "logs") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.242258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.266662 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.271890 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v" (OuterVolumeSpecName: "kube-api-access-v796v") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "kube-api-access-v796v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275226 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-httpd-config\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-internal-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.275426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-combined-ca-bundle\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.287847 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-public-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.293399 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c00dfc0-061b-43ba-b529-a89c9157a0cf-ovndb-tls-certs\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.294380 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wngx\" (UniqueName: \"kubernetes.io/projected/3c00dfc0-061b-43ba-b529-a89c9157a0cf-kube-api-access-9wngx\") pod \"neutron-667d547b9-4p8qm\" (UID: \"3c00dfc0-061b-43ba-b529-a89c9157a0cf\") " pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.320993 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data" (OuterVolumeSpecName: "config-data") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.323807 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts" (OuterVolumeSpecName: "scripts") pod "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" (UID: "5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337245 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337317 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337334 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337350 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.337364 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v796v\" (UniqueName: \"kubernetes.io/projected/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14-kube-api-access-v796v\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:26 crc kubenswrapper[4837]: I0313 12:07:26.388538 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.027014 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-667d547b9-4p8qm"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.097256 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.107995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c6787dc45-zbdfx" event={"ID":"5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14","Type":"ContainerDied","Data":"c8e41db64721802eb9e2d30e33b7feaf3f233822df5127e44d2dee0b5f64ca8a"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.108231 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c6787dc45-zbdfx" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.108263 4837 scope.go:117] "RemoveContainer" containerID="7f77bd5a27791856de608c8a08f3c83e1663f61407b889e9328671983bac96ca" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113036 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"bda4417fc8cd38d3600910af72f405f811adc75b43760e4deafc89dbb5440630"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113082 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" event={"ID":"74c7e377-b579-47bc-a992-cca0cf047627","Type":"ContainerStarted","Data":"f764b12423b04c656194dce53df7362f4f08a5d964fb08ede56b50cd03df2f6b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113295 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.113347 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.124122 4837 generic.go:334] "Generic (PLEG): container finished" podID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerID="541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" exitCode=0 Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.124219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.129073 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b5f9b5c85-p584g" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.131295 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"752e078a847a78d0c6486ef91987f5453c24a2b19503c1b1179258d77a7a485b"} Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.145422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" podStartSLOduration=3.145405275 podStartE2EDuration="3.145405275s" podCreationTimestamp="2026-03-13 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:27.136055952 +0000 UTC m=+1162.774322715" watchObservedRunningTime="2026-03-13 12:07:27.145405275 +0000 UTC m=+1162.783672038" Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.283238 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.298137 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b5f9b5c85-p584g"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.306709 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.313940 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c6787dc45-zbdfx"] Mar 13 12:07:27 crc kubenswrapper[4837]: I0313 12:07:27.427842 4837 scope.go:117] "RemoveContainer" containerID="b196a9394882e394baaf7222251dcba129911dfdfe911b4d1d679d89adbed206" Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.151868 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"a0dcaabad8d1b5fb470055b19ec292f56bcf50b4090047b7711c1d3f3cea96e6"} Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.152495 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-667d547b9-4p8qm" event={"ID":"3c00dfc0-061b-43ba-b529-a89c9157a0cf","Type":"ContainerStarted","Data":"1779fe880258500d895a7213d8cb917cc09aed870df6ff19a3eb464702b779bb"} Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.182492 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-667d547b9-4p8qm" podStartSLOduration=3.182475064 podStartE2EDuration="3.182475064s" podCreationTimestamp="2026-03-13 12:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:28.175452252 +0000 UTC m=+1163.813719015" watchObservedRunningTime="2026-03-13 12:07:28.182475064 +0000 UTC m=+1163.820741827" Mar 13 12:07:28 crc kubenswrapper[4837]: E0313 12:07:28.377870 4837 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/9a25fa6b22bb0edd887bfaf8ac40afc80451636491068fad146b7092f8766aa1/diff" to get inode usage: stat /var/lib/containers/storage/overlay/9a25fa6b22bb0edd887bfaf8ac40afc80451636491068fad146b7092f8766aa1/diff: no such file or directory, extraDiskErr: Mar 13 12:07:28 crc kubenswrapper[4837]: I0313 12:07:28.939436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" 
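The "Observed pod startup duration" entries above each report two figures. Judging from the timestamps quoted in those entries, podStartE2EDuration appears to be watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to subtract the image-pull window (lastFinishedPulling minus firstStartedPulling) from that; pods whose pull timestamps are the zero value ("0001-01-01 00:00:00 +0000 UTC") report the same number for both. A minimal Go sketch reproducing that arithmetic from the barbican-worker-6f4ff9ff9-mjmsz entry earlier in this log (illustrative only, not kubelet source):

package main

import (
	"fmt"
	"time"
)

// Layout matching the journal's timestamp format, e.g. "2026-03-13 12:07:17 +0000 UTC".
const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the openstack/barbican-worker-6f4ff9ff9-mjmsz
	// "Observed pod startup duration" entry above.
	created := mustParse("2026-03-13 12:07:17 +0000 UTC")       // podCreationTimestamp
	firstPull := mustParse("2026-03-13 12:07:18.97576274 +0000 UTC")  // firstStartedPulling
	lastPull := mustParse("2026-03-13 12:07:22.804092159 +0000 UTC")  // lastFinishedPulling
	watchObserved := mustParse("2026-03-13 12:07:23.87819637 +0000 UTC") // watchObservedRunningTime

	e2e := watchObserved.Sub(created)    // ~6.87819637s, matching podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ~3.049866951s, matching podStartSLOduration

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}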
Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.062400 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" path="/var/lib/kubelet/pods/1f2afb5c-bfb2-4349-8000-4c0c90892d56/volumes" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.063551 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" path="/var/lib/kubelet/pods/5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14/volumes" Mar 13 12:07:29 crc kubenswrapper[4837]: E0313 12:07:29.083083 4837 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.187914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerStarted","Data":"317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276"} Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.188786 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.211060 4837 generic.go:334] "Generic (PLEG): container finished" podID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerID="9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" exitCode=0 Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.212363 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529"} Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.212417 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.267751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.307582 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.705845236 podStartE2EDuration="8.307565097s" podCreationTimestamp="2026-03-13 12:07:21 +0000 UTC" firstStartedPulling="2026-03-13 12:07:23.444414269 +0000 UTC m=+1159.082681032" lastFinishedPulling="2026-03-13 12:07:28.04613412 +0000 UTC m=+1163.684400893" observedRunningTime="2026-03-13 12:07:29.219118158 +0000 UTC m=+1164.857384921" watchObservedRunningTime="2026-03-13 12:07:29.307565097 +0000 UTC m=+1164.945831860" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.337988 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.355774 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.357800 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.410173 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411426 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411519 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411580 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411856 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.411983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") pod \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\" (UID: \"0faefca0-6038-4bdf-856e-b7cb5b6c5536\") " Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.502174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.517053 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b" (OuterVolumeSpecName: "kube-api-access-4vh2b") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "kube-api-access-4vh2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.523937 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vh2b\" (UniqueName: \"kubernetes.io/projected/0faefca0-6038-4bdf-856e-b7cb5b6c5536-kube-api-access-4vh2b\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.523996 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.558902 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.573169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.588725 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.589120 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" containerID="cri-o://3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" gracePeriod=10 Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.589769 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.614985 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config" (OuterVolumeSpecName: "config") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627295 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627344 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627357 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.627369 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.634020 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.661936 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0faefca0-6038-4bdf-856e-b7cb5b6c5536" (UID: "0faefca0-6038-4bdf-856e-b7cb5b6c5536"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:29 crc kubenswrapper[4837]: I0313 12:07:29.730823 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0faefca0-6038-4bdf-856e-b7cb5b6c5536-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.174141 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228863 4837 generic.go:334] "Generic (PLEG): container finished" podID="e06de12b-6071-4dce-81f1-68539347ca19" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" exitCode=0 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228924 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228952 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" event={"ID":"e06de12b-6071-4dce-81f1-68539347ca19","Type":"ContainerDied","Data":"bd08737ad8dd4994dc887a5676bf16b9103265c49a66d4c535944bbd694008c2"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.228967 4837 scope.go:117] "RemoveContainer" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.229071 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t9gtj" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.239979 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c5479d889-t9mnp" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.240765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5479d889-t9mnp" event={"ID":"0faefca0-6038-4bdf-856e-b7cb5b6c5536","Type":"ContainerDied","Data":"ee2f2c6cd7031c0c388b4947ca3445235863139c835bb92b8b4570fbe2c76095"} Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.240990 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" containerID="cri-o://a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" gracePeriod=30 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.241177 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" containerID="cri-o://997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" gracePeriod=30 Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.241970 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242085 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242124 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242255 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242298 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.242332 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") pod \"e06de12b-6071-4dce-81f1-68539347ca19\" (UID: \"e06de12b-6071-4dce-81f1-68539347ca19\") " Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.263867 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r" (OuterVolumeSpecName: "kube-api-access-r558r") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). 
InnerVolumeSpecName "kube-api-access-r558r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.317378 4837 scope.go:117] "RemoveContainer" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.336303 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config" (OuterVolumeSpecName: "config") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.350363 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r558r\" (UniqueName: \"kubernetes.io/projected/e06de12b-6071-4dce-81f1-68539347ca19-kube-api-access-r558r\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.350403 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.381825 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.405144 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.416725 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.420138 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.431957 4837 scope.go:117] "RemoveContainer" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.432221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e06de12b-6071-4dce-81f1-68539347ca19" (UID: "e06de12b-6071-4dce-81f1-68539347ca19"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:30 crc kubenswrapper[4837]: E0313 12:07:30.435742 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": container with ID starting with 3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764 not found: ID does not exist" containerID="3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.435796 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764"} err="failed to get container status \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": rpc error: code = NotFound desc = could not find container \"3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764\": container with ID starting with 3ac59c1680a2ceb3bddaf98537b9cf745919c45cdb4c77563d14fc2bd8920764 not found: ID does not exist" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.435834 4837 scope.go:117] "RemoveContainer" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.438001 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c5479d889-t9mnp"] Mar 13 12:07:30 crc kubenswrapper[4837]: E0313 12:07:30.439764 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": container with ID starting with 059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3 not found: ID does not exist" containerID="059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.439803 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3"} err="failed to get container status \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": rpc error: code = NotFound desc = could not find container \"059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3\": container with ID starting with 059e96d08021e09252961e7fbfcfd1f264e2e10514bec4760c10d5076b6990a3 not found: ID does not exist" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.439824 4837 scope.go:117] "RemoveContainer" containerID="541466fc166402b9bfa4140bd97e50553b49c072d454b4f07847687fa559214e" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455133 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455204 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455218 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.455238 4837 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e06de12b-6071-4dce-81f1-68539347ca19-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.464841 4837 scope.go:117] "RemoveContainer" containerID="9a02a987a1d45aed6ebc32b498a9af8ccb4aa210832c48787a22a25a2228e529" Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.560129 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.572622 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t9gtj"] Mar 13 12:07:30 crc kubenswrapper[4837]: I0313 12:07:30.723026 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.062370 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" path="/var/lib/kubelet/pods/0faefca0-6038-4bdf-856e-b7cb5b6c5536/volumes" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.063384 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06de12b-6071-4dce-81f1-68539347ca19" path="/var/lib/kubelet/pods/e06de12b-6071-4dce-81f1-68539347ca19/volumes" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.293570 4837 generic.go:334] "Generic (PLEG): container finished" podID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" exitCode=0 Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.293631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.364243 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:31 crc kubenswrapper[4837]: I0313 12:07:31.705796 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.124214 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-fd6ddfd9b-f66l8" Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.209762 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.314578 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log" containerID="cri-o://92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" gracePeriod=30 Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.314994 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" containerID="cri-o://7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" gracePeriod=30 Mar 13 12:07:32 crc kubenswrapper[4837]: I0313 12:07:32.880987 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 12:07:33 crc 
kubenswrapper[4837]: I0313 12:07:33.000106 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.135470 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136143 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136339 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136394 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136395 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136447 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.136542 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") pod \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\" (UID: \"de6b1e01-3054-46d9-b2f3-a8f3a7e504af\") " Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.137546 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.145982 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc" (OuterVolumeSpecName: "kube-api-access-qwghc") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "kube-api-access-qwghc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.150792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.157781 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts" (OuterVolumeSpecName: "scripts") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.219410 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238937 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwghc\" (UniqueName: \"kubernetes.io/projected/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-kube-api-access-qwghc\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238976 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.238988 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.239000 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.250363 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data" (OuterVolumeSpecName: "config-data") pod "de6b1e01-3054-46d9-b2f3-a8f3a7e504af" (UID: "de6b1e01-3054-46d9-b2f3-a8f3a7e504af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322602 4837 generic.go:334] "Generic (PLEG): container finished" podID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" exitCode=0 Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de6b1e01-3054-46d9-b2f3-a8f3a7e504af","Type":"ContainerDied","Data":"036fe40da00c951a03639436f1d70f870b431fe6b0431a148132bcc8ff154aeb"} Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.322754 4837 scope.go:117] "RemoveContainer" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.323429 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.340427 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de6b1e01-3054-46d9-b2f3-a8f3a7e504af-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.342058 4837 scope.go:117] "RemoveContainer" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.368600 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.368840 4837 scope.go:117] "RemoveContainer" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.369489 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": container with ID starting with 997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb not found: ID does not exist" containerID="997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.369600 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb"} err="failed to get container status \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": rpc error: code = NotFound desc = could not find container \"997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb\": container with ID starting with 997441b3a97d2775eceaed89ffadcc848e11b76800ad4277b31bca177c72edfb not found: ID does not exist" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.369751 4837 scope.go:117] "RemoveContainer" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.372101 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": container with 
ID starting with a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc not found: ID does not exist" containerID="a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.372143 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc"} err="failed to get container status \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": rpc error: code = NotFound desc = could not find container \"a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc\": container with ID starting with a728200e4d66707c01f4e20cc7de5a1c1266b885af01c5ea8dd37d28e1bdd6bc not found: ID does not exist" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.379431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389193 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389606 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389667 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389688 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389697 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389711 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389718 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389749 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389764 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389771 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389788 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389795 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389819 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389831 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389837 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: E0313 12:07:33.389848 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="init" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.389854 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="init" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390014 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="cinder-scheduler" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390024 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06de12b-6071-4dce-81f1-68539347ca19" containerName="dnsmasq-dns" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390033 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2afb5c-bfb2-4349-8000-4c0c90892d56" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390048 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390055 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" containerName="probe" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390066 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc7beb5-1ac9-4bcb-adc5-34cd40a67e14" containerName="horizon-log" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390077 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-httpd" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.390087 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0faefca0-6038-4bdf-856e-b7cb5b6c5536" containerName="neutron-api" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.391048 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.396462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.411864 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.441952 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442119 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442184 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442393 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442592 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.442675 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544057 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544226 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544291 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544365 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544388 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.544489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/580b8861-16eb-4142-bd61-6d0221a07f4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.549751 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.550393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.551442 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.553480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b8861-16eb-4142-bd61-6d0221a07f4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc kubenswrapper[4837]: I0313 12:07:33.579830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5q2b\" (UniqueName: \"kubernetes.io/projected/580b8861-16eb-4142-bd61-6d0221a07f4d-kube-api-access-f5q2b\") pod \"cinder-scheduler-0\" (UID: \"580b8861-16eb-4142-bd61-6d0221a07f4d\") " pod="openstack/cinder-scheduler-0" Mar 13 12:07:33 crc 
kubenswrapper[4837]: I0313 12:07:33.725282 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 12:07:34 crc kubenswrapper[4837]: I0313 12:07:34.180067 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 12:07:34 crc kubenswrapper[4837]: I0313 12:07:34.341385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"5f16e4151a5c5a1d5e3bbbbef9382f3b0bfdb6f80b756732f6067a78ca15814c"} Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.061594 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6b1e01-3054-46d9-b2f3-a8f3a7e504af" path="/var/lib/kubelet/pods/de6b1e01-3054-46d9-b2f3-a8f3a7e504af/volumes" Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.366782 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"70a8b3af953d1a1c3f624af85221c7d64ee1bd28cc05308446e5cef8cdae4234"} Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.483579 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:07:35 crc kubenswrapper[4837]: I0313 12:07:35.483676 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.155536 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.377997 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"580b8861-16eb-4142-bd61-6d0221a07f4d","Type":"ContainerStarted","Data":"10e53747353169370f11883e6aece96dc7e6854b97f5b0d0c3342f5f1fc98d51"} Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.382518 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" exitCode=0 Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.382582 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.417298 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.417270206 podStartE2EDuration="3.417270206s" podCreationTimestamp="2026-03-13 12:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:36.415603554 +0000 UTC m=+1172.053870317" watchObservedRunningTime="2026-03-13 12:07:36.417270206 +0000 UTC m=+1172.055536969" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.779113 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:36 crc kubenswrapper[4837]: I0313 12:07:36.929203 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d84f6b8c8-8rrwq" Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.025752 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.026426 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" containerID="cri-o://104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" gracePeriod=30 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.026576 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" containerID="cri-o://f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" gracePeriod=30 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.395215 4837 generic.go:334] "Generic (PLEG): container finished" podID="91206ea2-5d2b-478d-983e-6c842f02819b" containerID="104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" exitCode=143 Mar 13 12:07:37 crc kubenswrapper[4837]: I0313 12:07:37.396217 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5"} Mar 13 12:07:38 crc kubenswrapper[4837]: I0313 12:07:38.727616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 12:07:39 crc kubenswrapper[4837]: I0313 12:07:39.113424 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55dc4d44f8-mvjvg" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.196817 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46590->10.217.0.166:9311: read: connection reset by peer" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.196843 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7598d89cd4-qfmh9" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46598->10.217.0.166:9311: read: connection reset by peer" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.421333 4837 generic.go:334] "Generic (PLEG): container finished" podID="91206ea2-5d2b-478d-983e-6c842f02819b" containerID="f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" exitCode=0 Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.421375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582"} Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.623702 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722339 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722386 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722444 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.722686 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") pod \"91206ea2-5d2b-478d-983e-6c842f02819b\" (UID: \"91206ea2-5d2b-478d-983e-6c842f02819b\") " Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.723737 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs" (OuterVolumeSpecName: "logs") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.735097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.746017 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf" (OuterVolumeSpecName: "kube-api-access-bx5rf") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "kube-api-access-bx5rf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.826996 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91206ea2-5d2b-478d-983e-6c842f02819b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.827048 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5rf\" (UniqueName: \"kubernetes.io/projected/91206ea2-5d2b-478d-983e-6c842f02819b-kube-api-access-bx5rf\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.827063 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.851851 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.898868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data" (OuterVolumeSpecName: "config-data") pod "91206ea2-5d2b-478d-983e-6c842f02819b" (UID: "91206ea2-5d2b-478d-983e-6c842f02819b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.929372 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:40 crc kubenswrapper[4837]: I0313 12:07:40.929427 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91206ea2-5d2b-478d-983e-6c842f02819b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430600 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7598d89cd4-qfmh9" event={"ID":"91206ea2-5d2b-478d-983e-6c842f02819b","Type":"ContainerDied","Data":"e401c09fc39f0377fdb0e13cc3564c85b21b640ae75df7edda1290f89d0c1fda"} Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430668 4837 scope.go:117] "RemoveContainer" containerID="f929d8442913bfaecc7956e8f7c394bf2287e01e6f101666b06a41edc759a582" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.430673 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7598d89cd4-qfmh9" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.451247 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.460284 4837 scope.go:117] "RemoveContainer" containerID="104e38d91432a24be429666c7aef47a48dc5e37624f7f42d829e3d5a83308ad5" Mar 13 12:07:41 crc kubenswrapper[4837]: I0313 12:07:41.461143 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7598d89cd4-qfmh9"] Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.813718 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:42 crc kubenswrapper[4837]: E0313 12:07:42.814321 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814341 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: E0313 12:07:42.814389 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814399 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814718 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.814750 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" containerName="barbican-api-log" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.815776 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818198 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818535 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.818713 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x4nfx" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.826497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877820 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.877975 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.878014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.979824 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:42 crc kubenswrapper[4837]: I0313 12:07:42.980993 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.000507 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.001824 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.003410 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"openstackclient\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.059241 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91206ea2-5d2b-478d-983e-6c842f02819b" path="/var/lib/kubelet/pods/91206ea2-5d2b-478d-983e-6c842f02819b/volumes" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.150279 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.267368 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.279700 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.298148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.299359 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.308786 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.389747 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390144 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390237 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.390294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.492823 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.492978 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493074 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.493820 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config\") pod \"openstackclient\" (UID: 
\"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.500462 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.500549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.514810 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnfz\" (UniqueName: \"kubernetes.io/projected/5d15c820-a2ee-4d4c-986f-2c2f09b43f79-kube-api-access-mrnfz\") pod \"openstackclient\" (UID: \"5d15c820-a2ee-4d4c-986f-2c2f09b43f79\") " pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: I0313 12:07:43.630097 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:43 crc kubenswrapper[4837]: E0313 12:07:43.677799 4837 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 12:07:43 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3c53a4cf-579b-49dd-88e3-cee64443611e_0(e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2" Netns:"/var/run/netns/2bd1c5d9-89fa-4eca-a7d7-beffdd102d90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2;K8S_POD_UID=3c53a4cf-579b-49dd-88e3-cee64443611e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/3c53a4cf-579b-49dd-88e3-cee64443611e:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] [openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 13 12:07:43 crc kubenswrapper[4837]: ' Mar 13 12:07:43 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 12:07:43 crc kubenswrapper[4837]: > Mar 13 12:07:43 crc kubenswrapper[4837]: E0313 12:07:43.677931 4837 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err=< Mar 13 12:07:43 crc kubenswrapper[4837]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_3c53a4cf-579b-49dd-88e3-cee64443611e_0(e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2" Netns:"/var/run/netns/2bd1c5d9-89fa-4eca-a7d7-beffdd102d90" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2;K8S_POD_UID=3c53a4cf-579b-49dd-88e3-cee64443611e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/3c53a4cf-579b-49dd-88e3-cee64443611e:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] [openstack/openstackclient e06daad3c7287324af1d17357c43cce41d497603c6e8a06dedc6a1df4754a4f2 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:ae [10.217.0.174/23] Mar 13 12:07:43 crc kubenswrapper[4837]: ' Mar 13 12:07:43 crc kubenswrapper[4837]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 12:07:43 crc kubenswrapper[4837]: > pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.035448 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.105746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.474556 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d15c820-a2ee-4d4c-986f-2c2f09b43f79","Type":"ContainerStarted","Data":"f21ec75b73478c7d1fa58479d3e978e46604c8a48f0a1340b54b15d8f3a3caaf"} Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.474585 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.479706 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3c53a4cf-579b-49dd-88e3-cee64443611e" podUID="5d15c820-a2ee-4d4c-986f-2c2f09b43f79" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.487578 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.511748 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.511807 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512019 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512092 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") pod \"3c53a4cf-579b-49dd-88e3-cee64443611e\" (UID: \"3c53a4cf-579b-49dd-88e3-cee64443611e\") " Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.512734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.517950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.518113 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf" (OuterVolumeSpecName: "kube-api-access-cgnqf") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "kube-api-access-cgnqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.518174 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c53a4cf-579b-49dd-88e3-cee64443611e" (UID: "3c53a4cf-579b-49dd-88e3-cee64443611e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.614752 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnqf\" (UniqueName: \"kubernetes.io/projected/3c53a4cf-579b-49dd-88e3-cee64443611e-kube-api-access-cgnqf\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615105 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615118 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:44 crc kubenswrapper[4837]: I0313 12:07:44.615133 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c53a4cf-579b-49dd-88e3-cee64443611e-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.063436 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c53a4cf-579b-49dd-88e3-cee64443611e" path="/var/lib/kubelet/pods/3c53a4cf-579b-49dd-88e3-cee64443611e/volumes" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.484437 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 12:07:45 crc kubenswrapper[4837]: I0313 12:07:45.490983 4837 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="3c53a4cf-579b-49dd-88e3-cee64443611e" podUID="5d15c820-a2ee-4d4c-986f-2c2f09b43f79" Mar 13 12:07:46 crc kubenswrapper[4837]: I0313 12:07:46.155941 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.467118 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.469302 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.471749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.471787 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.481320 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.484088 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581330 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581357 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581401 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581486 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " 
pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.581520 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.682960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683023 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683138 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683168 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683212 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.683243 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 
crc kubenswrapper[4837]: I0313 12:07:47.683873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-run-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.684392 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36ffa543-526d-4d56-b599-06fcfe0988cf-log-httpd\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692662 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-public-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692673 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-internal-tls-certs\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.692983 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-config-data\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.693516 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-etc-swift\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.696489 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ffa543-526d-4d56-b599-06fcfe0988cf-combined-ca-bundle\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.706404 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7474p\" (UniqueName: \"kubernetes.io/projected/36ffa543-526d-4d56-b599-06fcfe0988cf-kube-api-access-7474p\") pod \"swift-proxy-bfbc874dc-vsh7q\" (UID: \"36ffa543-526d-4d56-b599-06fcfe0988cf\") " pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:47 crc kubenswrapper[4837]: I0313 12:07:47.789616 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:48 crc kubenswrapper[4837]: I0313 12:07:48.354948 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bfbc874dc-vsh7q"] Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.043516 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.045145 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f7b5dc8d-rnsz6" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.140931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141224 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" containerID="cri-o://f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141282 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" containerID="cri-o://317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141345 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" containerID="cri-o://7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.141411 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" containerID="cri-o://343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" gracePeriod=30 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.172874 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.544044 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" exitCode=0 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.544085 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" exitCode=2 Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.545013 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276"} Mar 13 12:07:49 crc kubenswrapper[4837]: I0313 12:07:49.545052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b"} Mar 13 12:07:50 
crc kubenswrapper[4837]: I0313 12:07:50.563389 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" exitCode=0 Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563742 4837 generic.go:334] "Generic (PLEG): container finished" podID="8944c2be-da67-4cdd-9f75-0e473253e932" containerID="f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" exitCode=0 Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563455 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006"} Mar 13 12:07:50 crc kubenswrapper[4837]: I0313 12:07:50.563787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552"} Mar 13 12:07:52 crc kubenswrapper[4837]: I0313 12:07:52.339338 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": dial tcp 10.217.0.170:3000: connect: connection refused" Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.939750 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.944871 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.944932 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.945002 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope": 0x40000100 
== IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-conmon-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.945022 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8944c2be_da67_4cdd_9f75_0e473253e932.slice/crio-317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276.scope: no such file or directory Mar 13 12:07:53 crc kubenswrapper[4837]: W0313 12:07:53.960962 4837 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c53a4cf_579b_49dd_88e3_cee64443611e.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c53a4cf_579b_49dd_88e3_cee64443611e.slice: no such file or directory Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.445297 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.514932 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619744 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619820 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.619983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620024 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620049 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620080 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620155 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620262 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") pod \"8944c2be-da67-4cdd-9f75-0e473253e932\" (UID: \"8944c2be-da67-4cdd-9f75-0e473253e932\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620283 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") pod \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\" (UID: \"6f484085-7b83-46a8-80c2-b3ef6f8b8798\") " Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.620831 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.627786 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7" (OuterVolumeSpecName: "kube-api-access-cz8t7") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "kube-api-access-cz8t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.628743 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.629089 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.630578 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs" (OuterVolumeSpecName: "logs") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.630672 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.631094 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.631759 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts" (OuterVolumeSpecName: "scripts") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.634115 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"7f0dc730f5a20cd902650f8ce857c7f2a912d9c2636f815700cba37775ba2724"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.634165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"9361990b64ae4a8f16092212426b5d26a1198cd3e68259a624dc137146c20a4c"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.642506 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm" (OuterVolumeSpecName: "kube-api-access-ppxfm") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "kube-api-access-ppxfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.644369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5d15c820-a2ee-4d4c-986f-2c2f09b43f79","Type":"ContainerStarted","Data":"b8ad12b2d30d012686daaa741a37ef92d03c205452f05b4aae0977eeca475799"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.646800 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts" (OuterVolumeSpecName: "scripts") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663056 4837 generic.go:334] "Generic (PLEG): container finished" podID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" exitCode=137 Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663194 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663225 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6f484085-7b83-46a8-80c2-b3ef6f8b8798","Type":"ContainerDied","Data":"2608f88642291363e7163567a42948a0027f5da1a879663defaa6a0c943729b9"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663243 4837 scope.go:117] "RemoveContainer" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.663195 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.668673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.477881006 podStartE2EDuration="11.668630542s" podCreationTimestamp="2026-03-13 12:07:43 +0000 UTC" firstStartedPulling="2026-03-13 12:07:44.110366657 +0000 UTC m=+1179.748633440" lastFinishedPulling="2026-03-13 12:07:54.301116213 +0000 UTC m=+1189.939382976" observedRunningTime="2026-03-13 12:07:54.663292914 +0000 UTC m=+1190.301559697" watchObservedRunningTime="2026-03-13 12:07:54.668630542 +0000 UTC m=+1190.306897325" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.672787 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.678730 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8944c2be-da67-4cdd-9f75-0e473253e932","Type":"ContainerDied","Data":"affa40a245268506c6f6766fb2f158d46986fd7f106dd4cfb003b265c6f1faa4"} Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.679078 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.681692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.700027 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data" (OuterVolumeSpecName: "config-data") pod "6f484085-7b83-46a8-80c2-b3ef6f8b8798" (UID: "6f484085-7b83-46a8-80c2-b3ef6f8b8798"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.700232 4837 scope.go:117] "RemoveContainer" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729810 4837 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f484085-7b83-46a8-80c2-b3ef6f8b8798-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729847 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxfm\" (UniqueName: \"kubernetes.io/projected/6f484085-7b83-46a8-80c2-b3ef6f8b8798-kube-api-access-ppxfm\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729861 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729872 4837 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729884 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz8t7\" (UniqueName: \"kubernetes.io/projected/8944c2be-da67-4cdd-9f75-0e473253e932-kube-api-access-cz8t7\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729897 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729909 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8944c2be-da67-4cdd-9f75-0e473253e932-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729921 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729932 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729945 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f484085-7b83-46a8-80c2-b3ef6f8b8798-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.729956 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f484085-7b83-46a8-80c2-b3ef6f8b8798-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736218 4837 scope.go:117] "RemoveContainer" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736468 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: E0313 12:07:54.736782 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": container with ID starting with 4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d not found: ID does not exist" containerID="4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736816 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d"} err="failed to get container status \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": rpc error: code = NotFound desc = could not find container \"4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d\": container with ID starting with 4acde6c31477a18525c2bc313aa155955862f73aaa7708329d4edc29e752be5d not found: ID does not exist" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.736841 4837 scope.go:117] "RemoveContainer" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: E0313 12:07:54.737130 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": container with ID starting with bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5 not found: ID does not exist" containerID="bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.737171 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5"} err="failed to get container status \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": rpc error: code = NotFound desc = could not find container \"bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5\": container with ID starting with bb986b6c527ca78bf1e0896829a89d5b0ab27431c49d56719acca1f95eca36b5 not found: ID does not exist" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.737202 4837 scope.go:117] "RemoveContainer" containerID="317a9b585064c8564bcd2ae43ba40834f6b1cd25a3d83d32f956502b5b280276" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.749971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data" (OuterVolumeSpecName: "config-data") pod "8944c2be-da67-4cdd-9f75-0e473253e932" (UID: "8944c2be-da67-4cdd-9f75-0e473253e932"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.764172 4837 scope.go:117] "RemoveContainer" containerID="7d867af8873a7e0421fd52164c9458573cf6ac8847b38f845cf622f104ceb41b" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.792595 4837 scope.go:117] "RemoveContainer" containerID="343420b862af8b30fbf01c83c65d52d9d2faba010cdc819f2b823e8a9b058006" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.813592 4837 scope.go:117] "RemoveContainer" containerID="f106579d1eb92efbafe377b2c5e41ffb980fcd44573e4b8ba73109499680b552" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.830628 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.830679 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8944c2be-da67-4cdd-9f75-0e473253e932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:07:54 crc kubenswrapper[4837]: I0313 12:07:54.999261 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.018600 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.043878 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.083968 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" path="/var/lib/kubelet/pods/6f484085-7b83-46a8-80c2-b3ef6f8b8798/volumes" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.095330 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.102682 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.103186 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.103369 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.103674 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107081 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107280 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107399 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107471 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc 
kubenswrapper[4837]: E0313 12:07:55.107581 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107690 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: E0313 12:07:55.107797 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.107873 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108378 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api-log" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108552 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="sg-core" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108701 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.108801 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-notification-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.111473 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="ceilometer-central-agent" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.111720 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" containerName="proxy-httpd" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.113155 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.118759 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.114985 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.120010 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122574 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.122723 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.126683 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.129077 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.129330 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.133851 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138806 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138842 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138879 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.138995 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139011 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 
12:07:55.139063 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139109 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139122 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139158 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139177 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.139195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241248 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241362 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241419 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241482 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241513 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241539 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241628 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241776 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.241810 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.242739 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8004928-50bc-4db8-a701-4458c42bc776-logs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.242816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8004928-50bc-4db8-a701-4458c42bc776-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.247589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.248192 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250040 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250127 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data-custom\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.250938 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.251779 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254179 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254307 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-scripts\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.254402 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-config-data\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.260426 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8004928-50bc-4db8-a701-4458c42bc776-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.270244 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.271216 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ceilometer-0\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.272174 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd84\" (UniqueName: 
\"kubernetes.io/projected/a8004928-50bc-4db8-a701-4458c42bc776-kube-api-access-8rd84\") pod \"cinder-api-0\" (UID: \"a8004928-50bc-4db8-a701-4458c42bc776\") " pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.457603 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.468620 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.711679 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bfbc874dc-vsh7q" event={"ID":"36ffa543-526d-4d56-b599-06fcfe0988cf","Type":"ContainerStarted","Data":"fe5e27b6f150595d6723a12c077d23e29a6a968de2cd6f3c92d813816f07de44"} Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.712011 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.712180 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.722014 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.743319 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-bfbc874dc-vsh7q" podStartSLOduration=8.743300661 podStartE2EDuration="8.743300661s" podCreationTimestamp="2026-03-13 12:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:55.731170641 +0000 UTC m=+1191.369437404" watchObservedRunningTime="2026-03-13 12:07:55.743300661 +0000 UTC m=+1191.381567424" Mar 13 12:07:55 crc kubenswrapper[4837]: W0313 12:07:55.994451 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8004928_50bc_4db8_a701_4458c42bc776.slice/crio-25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea WatchSource:0}: Error finding container 25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea: Status 404 returned error can't find the container with id 25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea Mar 13 12:07:55 crc kubenswrapper[4837]: I0313 12:07:55.995919 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.107223 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.108810 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.130617 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.145918 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.156562 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5596f9dfb8-m9bxb" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.156843 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.185743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.185799 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.204674 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.206154 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.249462 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.251075 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.253445 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.283077 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286885 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286942 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.286971 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287002 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287025 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287288 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.287737 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.312602 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"nova-api-db-create-78jtc\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389324 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389380 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389421 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.389462 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.390957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.391286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.400125 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.401582 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.424556 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"nova-api-8886-account-create-update-ljcrw\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.428981 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-667d547b9-4p8qm" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.431655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"nova-cell0-db-create-mqgjq\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.453833 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.455254 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.458776 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.460027 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.467141 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.490851 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491031 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491105 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.491125 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod 
\"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.507769 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.538864 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.539221 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67f9f46cf4-9cvcg" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api" containerID="cri-o://592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.539856 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67f9f46cf4-9cvcg" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd" containerID="cri-o://5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.565145 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595191 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595504 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.595660 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.596513 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " 
pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.596712 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.619386 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"nova-cell0-4581-account-create-update-w6tc2\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.629286 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"nova-cell1-db-create-t8qk9\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.648715 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.650290 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.657978 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.697290 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.700363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.700427 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.785596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"8963d958bbfe2f25190f6d4efa0bcd7a6fe7c107dfdb4e163c3ec794ab189d07"} Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.789292 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.789961 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.803713 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.803753 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.804208 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"25518f190eebe12e29d032768537d154d0c34493c273de162b69131ff26850ea"} Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.806977 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.823882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"nova-cell1-c124-account-create-update-8zqgg\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.848881 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.849114 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log" containerID="cri-o://5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" gracePeriod=30 Mar 13 12:07:56 crc kubenswrapper[4837]: I0313 12:07:56.849389 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd" containerID="cri-o://06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" gracePeriod=30 Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.008814 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.182865 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8944c2be-da67-4cdd-9f75-0e473253e932" path="/var/lib/kubelet/pods/8944c2be-da67-4cdd-9f75-0e473253e932/volumes" Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.183564 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.801682 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.818498 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.842405 4837 generic.go:334] "Generic (PLEG): container finished" podID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" exitCode=0 Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.842528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.845946 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"4e476b792fcc3524b8f2bd0219d572eb68bfad23422ead86a6897878642bf878"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.847896 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerStarted","Data":"1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.847930 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerStarted","Data":"3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.894543 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0173ba9-535a-435d-bc51-75c069e69e46" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" exitCode=143 Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.894807 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.902385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"} Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.974499 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-78jtc" podStartSLOduration=1.9744745510000001 podStartE2EDuration="1.974474551s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:57.88851587 +0000 UTC m=+1193.526782633" watchObservedRunningTime="2026-03-13 12:07:57.974474551 +0000 UTC m=+1193.612741314" Mar 13 12:07:57 crc kubenswrapper[4837]: I0313 12:07:57.979277 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.004999 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:07:58 crc kubenswrapper[4837]: W0313 12:07:58.180998 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac843c1_9934_4711_aae6_7f6920596cb3.slice/crio-bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99 WatchSource:0}: Error finding container bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99: Status 404 returned error can't find the container with id bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99 Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.193820 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.912331 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerStarted","Data":"5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.912651 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerStarted","Data":"69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.914124 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerStarted","Data":"c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.914172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerStarted","Data":"c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.915499 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerStarted","Data":"5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.915529 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerStarted","Data":"bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.916961 4837 generic.go:334] "Generic (PLEG): container finished" podID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerID="76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f" exitCode=0 Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 
12:07:58.917074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerDied","Data":"76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.917091 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerStarted","Data":"1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.919122 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.920845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a8004928-50bc-4db8-a701-4458c42bc776","Type":"ContainerStarted","Data":"132bbc5ebed761d4d1fc57b12552302464956cd861bf757a4ff02da13f8f52f6"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.920991 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.922309 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerStarted","Data":"a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.922341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerStarted","Data":"e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.923685 4837 generic.go:334] "Generic (PLEG): container finished" podID="ff8550d6-aacb-4848-928d-b1581a66d499" containerID="1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807" exitCode=0 Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.923718 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerDied","Data":"1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807"} Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.935545 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-t8qk9" podStartSLOduration=2.93552743 podStartE2EDuration="2.93552743s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:58.925823125 +0000 UTC m=+1194.564089908" watchObservedRunningTime="2026-03-13 12:07:58.93552743 +0000 UTC m=+1194.573794193" Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.964091 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8886-account-create-update-ljcrw" podStartSLOduration=2.964075428 podStartE2EDuration="2.964075428s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 12:07:58.955098345 +0000 UTC m=+1194.593365108" watchObservedRunningTime="2026-03-13 12:07:58.964075428 +0000 UTC m=+1194.602342191" Mar 13 12:07:58 crc kubenswrapper[4837]: I0313 12:07:58.970830 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" podStartSLOduration=2.970813599 podStartE2EDuration="2.970813599s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:58.9683083 +0000 UTC m=+1194.606575063" watchObservedRunningTime="2026-03-13 12:07:58.970813599 +0000 UTC m=+1194.609080362" Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.019108 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" podStartSLOduration=3.019083696 podStartE2EDuration="3.019083696s" podCreationTimestamp="2026-03-13 12:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:59.010150455 +0000 UTC m=+1194.648417238" watchObservedRunningTime="2026-03-13 12:07:59.019083696 +0000 UTC m=+1194.657350459" Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.044894 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.044877446 podStartE2EDuration="4.044877446s" podCreationTimestamp="2026-03-13 12:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:07:59.036762052 +0000 UTC m=+1194.675028815" watchObservedRunningTime="2026-03-13 12:07:59.044877446 +0000 UTC m=+1194.683144199" Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.405664 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="6f484085-7b83-46a8-80c2-b3ef6f8b8798" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.169:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.942130 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerID="5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2" exitCode=0 Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.942200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerDied","Data":"5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2"} Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.944777 4837 generic.go:334] "Generic (PLEG): container finished" podID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerID="5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88" exitCode=0 Mar 13 12:07:59 crc kubenswrapper[4837]: I0313 12:07:59.945039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerDied","Data":"5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.145284 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.147207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149158 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149444 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.149624 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.155583 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.212383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.313574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.334410 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"auto-csr-approver-29556728-7n29h\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.432088 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.524326 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.541043 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623249 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") pod \"e51457d7-9619-4179-8f01-de6ffe5ceb82\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623342 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") pod \"ff8550d6-aacb-4848-928d-b1581a66d499\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623396 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") pod \"ff8550d6-aacb-4848-928d-b1581a66d499\" (UID: \"ff8550d6-aacb-4848-928d-b1581a66d499\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.623423 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") pod \"e51457d7-9619-4179-8f01-de6ffe5ceb82\" (UID: \"e51457d7-9619-4179-8f01-de6ffe5ceb82\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.625605 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e51457d7-9619-4179-8f01-de6ffe5ceb82" (UID: "e51457d7-9619-4179-8f01-de6ffe5ceb82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.627397 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff8550d6-aacb-4848-928d-b1581a66d499" (UID: "ff8550d6-aacb-4848-928d-b1581a66d499"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.636038 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn" (OuterVolumeSpecName: "kube-api-access-lzhsn") pod "ff8550d6-aacb-4848-928d-b1581a66d499" (UID: "ff8550d6-aacb-4848-928d-b1581a66d499"). InnerVolumeSpecName "kube-api-access-lzhsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.636112 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms" (OuterVolumeSpecName: "kube-api-access-9dmms") pod "e51457d7-9619-4179-8f01-de6ffe5ceb82" (UID: "e51457d7-9619-4179-8f01-de6ffe5ceb82"). InnerVolumeSpecName "kube-api-access-9dmms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726035 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e51457d7-9619-4179-8f01-de6ffe5ceb82-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726319 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff8550d6-aacb-4848-928d-b1581a66d499-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726332 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhsn\" (UniqueName: \"kubernetes.io/projected/ff8550d6-aacb-4848-928d-b1581a66d499-kube-api-access-lzhsn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.726343 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dmms\" (UniqueName: \"kubernetes.io/projected/e51457d7-9619-4179-8f01-de6ffe5ceb82-kube-api-access-9dmms\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.762081 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827373 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827473 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827496 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827571 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.827669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") pod \"f0173ba9-535a-435d-bc51-75c069e69e46\" (UID: \"f0173ba9-535a-435d-bc51-75c069e69e46\") " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.828492 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs" (OuterVolumeSpecName: "logs") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.830822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.835613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7" (OuterVolumeSpecName: "kube-api-access-gvzd7") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "kube-api-access-gvzd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.839104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts" (OuterVolumeSpecName: "scripts") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.843125 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.886956 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.907551 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data" (OuterVolumeSpecName: "config-data") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929283 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929331 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929347 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvzd7\" (UniqueName: \"kubernetes.io/projected/f0173ba9-535a-435d-bc51-75c069e69e46-kube-api-access-gvzd7\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929358 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0173ba9-535a-435d-bc51-75c069e69e46-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929367 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929387 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929398 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.929734 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0173ba9-535a-435d-bc51-75c069e69e46" (UID: "f0173ba9-535a-435d-bc51-75c069e69e46"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.958894 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.966746 4837 generic.go:334] "Generic (PLEG): container finished" podID="e397db42-b505-4447-87a2-4c12ed412f28" containerID="a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d" exitCode=0 Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.966844 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerDied","Data":"a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972631 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-78jtc" event={"ID":"ff8550d6-aacb-4848-928d-b1581a66d499","Type":"ContainerDied","Data":"3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972851 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6b7d9266c6c68f69a431eb6b6c17756be54da729f94a82611fe08a1b1a72be" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.972985 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-78jtc" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.975305 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec46ef58-a8e9-4354-b9a1-568535879964" containerID="c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0" exitCode=0 Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.975375 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerDied","Data":"c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978608 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0173ba9-535a-435d-bc51-75c069e69e46" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" exitCode=0 Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978741 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.978981 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.979043 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f0173ba9-535a-435d-bc51-75c069e69e46","Type":"ContainerDied","Data":"a0b7b975f0a853ab5afecbe29fde34fc4210243637b54de91d96895e00f81e30"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.979065 4837 scope.go:117] "RemoveContainer" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987005 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mqgjq" event={"ID":"e51457d7-9619-4179-8f01-de6ffe5ceb82","Type":"ContainerDied","Data":"1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520"} Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987041 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff4b0ecebe26fef949db006c82847907f81761a168e24a50335110289080520" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.987110 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mqgjq" Mar 13 12:08:00 crc kubenswrapper[4837]: I0313 12:08:00.998518 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"} Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.025865 4837 scope.go:117] "RemoveContainer" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.032259 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.032303 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0173ba9-535a-435d-bc51-75c069e69e46-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.101493 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.115792 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.133411 4837 scope.go:117] "RemoveContainer" containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.138782 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": container with ID starting with 06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6 not found: ID does not exist" 
containerID="06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.138823 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6"} err="failed to get container status \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": rpc error: code = NotFound desc = could not find container \"06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6\": container with ID starting with 06e09154da451b1a2177b0ac750f567de80b4f12b1c2aa79102cdc2b77f671b6 not found: ID does not exist" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.138846 4837 scope.go:117] "RemoveContainer" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.141866 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": container with ID starting with 5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482 not found: ID does not exist" containerID="5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.141917 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482"} err="failed to get container status \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": rpc error: code = NotFound desc = could not find container \"5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482\": container with ID starting with 5e147e84ce0affb8bbda5c741ef88617e4ee699f66923e1ff90efae96ad20482 not found: ID does not exist" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.146371 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156179 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156606 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156622 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log" Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156660 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156669 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156683 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156689 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd" Mar 13 12:08:01 crc kubenswrapper[4837]: E0313 12:08:01.156707 4837 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156712 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156877 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-log" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156894 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156904 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" containerName="mariadb-database-create" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.156917 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" containerName="glance-httpd" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.157858 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.160210 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.160784 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.174028 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350593 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350829 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.350918 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351230 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351268 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.351336 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453448 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453502 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453545 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453576 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453616 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453682 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.453767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.464195 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-logs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.464500 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.468901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.474344 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.474597 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3f87d89-35d5-4dc0-9c37-5297718a9351-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.483496 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.485175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3f87d89-35d5-4dc0-9c37-5297718a9351-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.530749 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hs6\" (UniqueName: \"kubernetes.io/projected/d3f87d89-35d5-4dc0-9c37-5297718a9351-kube-api-access-d8hs6\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.541456 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d3f87d89-35d5-4dc0-9c37-5297718a9351\") " pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.597311 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.609316 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762591 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") pod \"6ac843c1-9934-4711-aae6-7f6920596cb3\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762744 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") pod \"6ac843c1-9934-4711-aae6-7f6920596cb3\" (UID: \"6ac843c1-9934-4711-aae6-7f6920596cb3\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.762808 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") pod \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.763169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") pod \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\" (UID: \"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.764207 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ac843c1-9934-4711-aae6-7f6920596cb3" (UID: "6ac843c1-9934-4711-aae6-7f6920596cb3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.764998 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" (UID: "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.765071 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ac843c1-9934-4711-aae6-7f6920596cb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.767666 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd" (OuterVolumeSpecName: "kube-api-access-nzzgd") pod "6ac843c1-9934-4711-aae6-7f6920596cb3" (UID: "6ac843c1-9934-4711-aae6-7f6920596cb3"). InnerVolumeSpecName "kube-api-access-nzzgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.770405 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7" (OuterVolumeSpecName: "kube-api-access-tfjj7") pod "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" (UID: "8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb"). InnerVolumeSpecName "kube-api-access-tfjj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.794360 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866733 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzzgd\" (UniqueName: \"kubernetes.io/projected/6ac843c1-9934-4711-aae6-7f6920596cb3-kube-api-access-nzzgd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866779 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfjj7\" (UniqueName: \"kubernetes.io/projected/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-kube-api-access-tfjj7\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.866793 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.888741 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978784 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978928 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.978987 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.979060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:01 crc kubenswrapper[4837]: I0313 12:08:01.979171 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") pod \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\" (UID: \"073acab9-3b9b-432a-aef7-b59bad9fa6ea\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.041794 4837 generic.go:334] "Generic (PLEG): container finished" podID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" exitCode=0 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042216 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042246 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67f9f46cf4-9cvcg" event={"ID":"073acab9-3b9b-432a-aef7-b59bad9fa6ea","Type":"ContainerDied","Data":"f8cb990fe37777f793f0250c14cdfa0b903194e81a71cae32c4012805c32b7c7"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042273 4837 scope.go:117] "RemoveContainer" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.042460 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67f9f46cf4-9cvcg" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.051958 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t8qk9" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.053911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.054304 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9" (OuterVolumeSpecName: "kube-api-access-s56d9") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "kube-api-access-s56d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.062206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t8qk9" event={"ID":"8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb","Type":"ContainerDied","Data":"69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.073815 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cde6bb4086abefd3b421a4e0db78878c121c0be2c2674284c1675581361bdc" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.073867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerStarted","Data":"f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079027 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079153 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c124-account-create-update-8zqgg" event={"ID":"6ac843c1-9934-4711-aae6-7f6920596cb3","Type":"ContainerDied","Data":"bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99"} Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.079182 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd1db75a0f6b89871b87bcc95f04a2edb02439e89ab9b5e1a096304f7490db99" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.083737 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.083859 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56d9\" (UniqueName: \"kubernetes.io/projected/073acab9-3b9b-432a-aef7-b59bad9fa6ea-kube-api-access-s56d9\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.134656 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.140710 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config" (OuterVolumeSpecName: "config") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.154923 4837 scope.go:117] "RemoveContainer" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.185371 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "073acab9-3b9b-432a-aef7-b59bad9fa6ea" (UID: "073acab9-3b9b-432a-aef7-b59bad9fa6ea"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208029 4837 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208189 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.208302 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073acab9-3b9b-432a-aef7-b59bad9fa6ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.232122 4837 scope.go:117] "RemoveContainer" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: E0313 12:08:02.239110 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": container with ID starting with 5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f not found: ID does not exist" containerID="5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.239155 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f"} err="failed to get container status \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": rpc error: code = NotFound desc = could not find container \"5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f\": container with ID starting with 5abad0665ef76d6dfd0a8789ace2e34eb216059566daed67b5a9ba64a43b080f not found: ID does not exist" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.239183 4837 scope.go:117] "RemoveContainer" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: E0313 12:08:02.242210 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": 
container with ID starting with 592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458 not found: ID does not exist" containerID="592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.242337 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458"} err="failed to get container status \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": rpc error: code = NotFound desc = could not find container \"592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458\": container with ID starting with 592ae8d9d134287aaaf8e8bd131ed85ae4a0f882f18fe4b309c2608413d81458 not found: ID does not exist" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.502093 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.539449 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67f9f46cf4-9cvcg"] Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.598224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 12:08:02 crc kubenswrapper[4837]: W0313 12:08:02.605444 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3f87d89_35d5_4dc0_9c37_5297718a9351.slice/crio-5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41 WatchSource:0}: Error finding container 5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41: Status 404 returned error can't find the container with id 5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.720835 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.740242 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.813573 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.815519 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bfbc874dc-vsh7q" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.825989 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") pod \"e397db42-b505-4447-87a2-4c12ed412f28\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826115 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") pod \"ec46ef58-a8e9-4354-b9a1-568535879964\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826172 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") pod \"e397db42-b505-4447-87a2-4c12ed412f28\" (UID: \"e397db42-b505-4447-87a2-4c12ed412f28\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.826237 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") pod \"ec46ef58-a8e9-4354-b9a1-568535879964\" (UID: \"ec46ef58-a8e9-4354-b9a1-568535879964\") " Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.828357 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec46ef58-a8e9-4354-b9a1-568535879964" (UID: "ec46ef58-a8e9-4354-b9a1-568535879964"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.828493 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e397db42-b505-4447-87a2-4c12ed412f28" (UID: "e397db42-b505-4447-87a2-4c12ed412f28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.840208 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r" (OuterVolumeSpecName: "kube-api-access-bcs7r") pod "ec46ef58-a8e9-4354-b9a1-568535879964" (UID: "ec46ef58-a8e9-4354-b9a1-568535879964"). InnerVolumeSpecName "kube-api-access-bcs7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.843535 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn" (OuterVolumeSpecName: "kube-api-access-q8wqn") pod "e397db42-b505-4447-87a2-4c12ed412f28" (UID: "e397db42-b505-4447-87a2-4c12ed412f28"). InnerVolumeSpecName "kube-api-access-q8wqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.860608 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.860901 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" containerID="cri-o://4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" gracePeriod=30 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.861069 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" containerID="cri-o://a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" gracePeriod=30 Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928694 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec46ef58-a8e9-4354-b9a1-568535879964-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928731 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8wqn\" (UniqueName: \"kubernetes.io/projected/e397db42-b505-4447-87a2-4c12ed412f28-kube-api-access-q8wqn\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928745 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcs7r\" (UniqueName: \"kubernetes.io/projected/ec46ef58-a8e9-4354-b9a1-568535879964-kube-api-access-bcs7r\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.928758 4837 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e397db42-b505-4447-87a2-4c12ed412f28-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:02 crc kubenswrapper[4837]: I0313 12:08:02.944382 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030567 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030736 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030838 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.030949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") pod \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\" (UID: \"2a28d7a5-22a2-460a-a08c-8eb484e6c382\") " Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.031711 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs" (OuterVolumeSpecName: "logs") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.035287 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz" (OuterVolumeSpecName: "kube-api-access-wvsmz") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "kube-api-access-wvsmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.042613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.065778 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" path="/var/lib/kubelet/pods/073acab9-3b9b-432a-aef7-b59bad9fa6ea/volumes" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.066721 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0173ba9-535a-435d-bc51-75c069e69e46" path="/var/lib/kubelet/pods/f0173ba9-535a-435d-bc51-75c069e69e46/volumes" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.074998 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data" (OuterVolumeSpecName: "config-data") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.102606 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"5b07cd89ede2fe2c1fb21826f3293ae30dafc157e24ff9974fed53f7d9792a41"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8886-account-create-update-ljcrw" event={"ID":"e397db42-b505-4447-87a2-4c12ed412f28","Type":"ContainerDied","Data":"e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104451 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4a50d17ef0d5b10ca2c0d2aeafd143cd9c5e63e31ce86c11aca1ecba4422049" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.104502 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8886-account-create-update-ljcrw" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.118331 4837 generic.go:334] "Generic (PLEG): container finished" podID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerID="4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" exitCode=143 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.118470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125284 4837 generic.go:334] "Generic (PLEG): container finished" podID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" exitCode=137 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125357 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125388 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5596f9dfb8-m9bxb" event={"ID":"2a28d7a5-22a2-460a-a08c-8eb484e6c382","Type":"ContainerDied","Data":"60985fc2aa747df3481773b902df6591e8f7e0a9aaa937b1d8ccf7c3a2e33f6e"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125408 4837 scope.go:117] "RemoveContainer" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.125528 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5596f9dfb8-m9bxb" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.129462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133084 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a28d7a5-22a2-460a-a08c-8eb484e6c382-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133139 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsmz\" (UniqueName: \"kubernetes.io/projected/2a28d7a5-22a2-460a-a08c-8eb484e6c382-kube-api-access-wvsmz\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133158 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133170 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.133181 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144775 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144773 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4581-account-create-update-w6tc2" event={"ID":"ec46ef58-a8e9-4354-b9a1-568535879964","Type":"ContainerDied","Data":"c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.144910 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c19e4f72f1c5f35690c3fb2bd2be44a6fa31ffdb302bbef941997a819e69b808" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.147553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerStarted","Data":"d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerStarted","Data":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"} Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158761 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" containerID="cri-o://f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" gracePeriod=30 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158880 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" containerID="cri-o://4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" gracePeriod=30 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158932 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" containerID="cri-o://2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" gracePeriod=30 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.158976 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" containerID="cri-o://759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" gracePeriod=30 Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.181290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.198202 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556728-7n29h" podStartSLOduration=2.255125133 podStartE2EDuration="3.198178236s" podCreationTimestamp="2026-03-13 12:08:00 +0000 UTC" firstStartedPulling="2026-03-13 12:08:01.132725043 +0000 UTC m=+1196.770991806" lastFinishedPulling="2026-03-13 12:08:02.075778146 +0000 UTC m=+1197.714044909" observedRunningTime="2026-03-13 12:08:03.16427505 +0000 UTC m=+1198.802541823" watchObservedRunningTime="2026-03-13 12:08:03.198178236 +0000 UTC m=+1198.836445009" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.207307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts" (OuterVolumeSpecName: "scripts") pod "2a28d7a5-22a2-460a-a08c-8eb484e6c382" (UID: "2a28d7a5-22a2-460a-a08c-8eb484e6c382"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.210153 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.553587896 podStartE2EDuration="8.210135992s" podCreationTimestamp="2026-03-13 12:07:55 +0000 UTC" firstStartedPulling="2026-03-13 12:07:56.214871619 +0000 UTC m=+1191.853138382" lastFinishedPulling="2026-03-13 12:08:01.871419715 +0000 UTC m=+1197.509686478" observedRunningTime="2026-03-13 12:08:03.187379787 +0000 UTC m=+1198.825646550" watchObservedRunningTime="2026-03-13 12:08:03.210135992 +0000 UTC m=+1198.848402755" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.235343 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a28d7a5-22a2-460a-a08c-8eb484e6c382-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.235370 4837 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a28d7a5-22a2-460a-a08c-8eb484e6c382-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.367982 4837 scope.go:117] "RemoveContainer" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482201 4837 scope.go:117] "RemoveContainer" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" Mar 13 12:08:03 crc kubenswrapper[4837]: E0313 12:08:03.482690 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": container with ID starting with 7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb not found: ID does not exist" containerID="7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482728 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb"} err="failed to get container status \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": rpc error: code = NotFound desc = could not find container \"7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb\": container with ID starting with 7e464f7436823332f050e26237bc563d04c928c21ee9b8d3087ae1cc9a85aacb not found: ID does not exist" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.482755 4837 scope.go:117] "RemoveContainer" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" Mar 13 12:08:03 crc kubenswrapper[4837]: E0313 12:08:03.483301 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": container with ID starting with 92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654 not found: ID does not exist" containerID="92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.483349 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654"} err="failed to get container status 
\"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": rpc error: code = NotFound desc = could not find container \"92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654\": container with ID starting with 92b3db8efc4bd781409e05974c86a887259d700facd2c2ab05a9fcc6613ce654 not found: ID does not exist" Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.512069 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:08:03 crc kubenswrapper[4837]: I0313 12:08:03.525709 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5596f9dfb8-m9bxb"] Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.183828 4837 generic.go:334] "Generic (PLEG): container finished" podID="47ae408b-faad-4a52-ad09-428242645381" containerID="d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21" exitCode=0 Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.184181 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerDied","Data":"d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21"} Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187720 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" exitCode=0 Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187750 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" exitCode=2 Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187759 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" exitCode=0 Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187794 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"} Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187817 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"} Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.187829 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"} Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.191199 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"db9e0234f3793624350d9bf2860efc958e3f44554f4b5ac4ae84cf488c1ce7e4"} Mar 13 12:08:04 crc kubenswrapper[4837]: I0313 12:08:04.191365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3f87d89-35d5-4dc0-9c37-5297718a9351","Type":"ContainerStarted","Data":"fdfc517ecfc54d1ab6ae838e5f84fb794c7ede4598c042e712fca287be710aad"} Mar 13 12:08:04 crc kubenswrapper[4837]: 
I0313 12:08:04.228433 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.228410309 podStartE2EDuration="3.228410309s" podCreationTimestamp="2026-03-13 12:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:04.221471811 +0000 UTC m=+1199.859738584" watchObservedRunningTime="2026-03-13 12:08:04.228410309 +0000 UTC m=+1199.866677072" Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.063947 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" path="/var/lib/kubelet/pods/2a28d7a5-22a2-460a-a08c-8eb484e6c382/volumes" Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.484258 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.484631 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.545574 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.576861 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") pod \"47ae408b-faad-4a52-ad09-428242645381\" (UID: \"47ae408b-faad-4a52-ad09-428242645381\") " Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.582838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k" (OuterVolumeSpecName: "kube-api-access-l2d5k") pod "47ae408b-faad-4a52-ad09-428242645381" (UID: "47ae408b-faad-4a52-ad09-428242645381"). InnerVolumeSpecName "kube-api-access-l2d5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:05 crc kubenswrapper[4837]: I0313 12:08:05.679369 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d5k\" (UniqueName: \"kubernetes.io/projected/47ae408b-faad-4a52-ad09-428242645381-kube-api-access-l2d5k\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.221627 4837 generic.go:334] "Generic (PLEG): container finished" podID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerID="a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" exitCode=0 Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.221669 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f"} Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556728-7n29h" event={"ID":"47ae408b-faad-4a52-ad09-428242645381","Type":"ContainerDied","Data":"f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f"} Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224048 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3da93ab4a472ba7116a0beb08f63b2c302111f8cbb9bf5768b3b8124101f12f" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.224055 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556728-7n29h" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.242178 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.251091 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556722-h599x"] Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.615342 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804366 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804745 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804759 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804778 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804785 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804797 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804803 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804814 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804820 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804833 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804839 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804851 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804857 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804869 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804875 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804884 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804890 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804896 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804902 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804914 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804920 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api" Mar 13 12:08:06 crc kubenswrapper[4837]: E0313 12:08:06.804933 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.804938 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805090 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon-log" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805107 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805116 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ae408b-faad-4a52-ad09-428242645381" containerName="oc" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805135 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28d7a5-22a2-460a-a08c-8eb484e6c382" containerName="horizon" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805144 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805152 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" containerName="mariadb-database-create" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805160 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805170 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="073acab9-3b9b-432a-aef7-b59bad9fa6ea" containerName="neutron-api" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805180 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e397db42-b505-4447-87a2-4c12ed412f28" containerName="mariadb-account-create-update" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805192 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.805749 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807164 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807221 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807323 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807377 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.807540 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") pod \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\" (UID: \"9fdb2289-943a-4078-ab5f-cab9a7b4faf1\") " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.808136 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs" (OuterVolumeSpecName: "logs") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.809952 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.810104 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.810471 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.812495 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qctwr" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.830056 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.842537 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n" (OuterVolumeSpecName: "kube-api-access-4cp4n") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "kube-api-access-4cp4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.846976 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts" (OuterVolumeSpecName: "scripts") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.851852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908771 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908878 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.908957 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.909007 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910208 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910239 4837 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910261 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910270 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910279 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cp4n\" (UniqueName: \"kubernetes.io/projected/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-kube-api-access-4cp4n\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.910288 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.956550 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 13 12:08:06 crc 
kubenswrapper[4837]: I0313 12:08:06.973851 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:06 crc kubenswrapper[4837]: I0313 12:08:06.977169 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data" (OuterVolumeSpecName: "config-data") pod "9fdb2289-943a-4078-ab5f-cab9a7b4faf1" (UID: "9fdb2289-943a-4078-ab5f-cab9a7b4faf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011411 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011466 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011510 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011585 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011629 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011656 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.011666 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdb2289-943a-4078-ab5f-cab9a7b4faf1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.026374 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " 
pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.027459 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.028480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.034165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"nova-cell0-conductor-db-sync-f6gwd\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.040306 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.075265 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be033789-27be-444d-b72e-7abbbb34b285" path="/var/lib/kubelet/pods/be033789-27be-444d-b72e-7abbbb34b285/volumes" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239436 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fdb2289-943a-4078-ab5f-cab9a7b4faf1","Type":"ContainerDied","Data":"9af1f1b6bae1b057a7c5b2be284aed718dd1bd53fd4267a097ec24a461a2d852"} Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239784 4837 scope.go:117] "RemoveContainer" containerID="a7e9d992e509609ea914f80658069ef20b3e4ab7548f88fd1489567b1ca63a1f" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.239546 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.313584 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.334710 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.354705 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.356587 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.359218 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.360100 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.370557 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.395507 4837 scope.go:117] "RemoveContainer" containerID="4c6f80cedfefe6ca3ffa3fd1f8e5bca2af1a1e041ef15266c27ebfeb6b6939ec" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527131 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527432 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527580 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527616 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527663 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") 
" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.527701 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.587161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.594785 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629604 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629670 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629854 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.629894 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630115 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f3b003-127f-414f-877a-8f7df2872049-logs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.630698 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.638770 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.639514 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.640548 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.643506 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0f3b003-127f-414f-877a-8f7df2872049-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.652535 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48pn7\" (UniqueName: \"kubernetes.io/projected/d0f3b003-127f-414f-877a-8f7df2872049-kube-api-access-48pn7\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.683031 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d0f3b003-127f-414f-877a-8f7df2872049\") " pod="openstack/glance-default-internal-api-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.739397 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.934951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935819 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.935895 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") pod \"ec252a2a-f9a4-4894-991d-1a70f596519d\" (UID: \"ec252a2a-f9a4-4894-991d-1a70f596519d\") " Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936123 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936245 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936616 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.936722 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec252a2a-f9a4-4894-991d-1a70f596519d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.943782 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts" (OuterVolumeSpecName: "scripts") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.946583 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm" (OuterVolumeSpecName: "kube-api-access-9jmwm") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "kube-api-access-9jmwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:07 crc kubenswrapper[4837]: I0313 12:08:07.974255 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.006810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038319 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038353 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.038363 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jmwm\" (UniqueName: \"kubernetes.io/projected/ec252a2a-f9a4-4894-991d-1a70f596519d-kube-api-access-9jmwm\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.049761 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.063719 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data" (OuterVolumeSpecName: "config-data") pod "ec252a2a-f9a4-4894-991d-1a70f596519d" (UID: "ec252a2a-f9a4-4894-991d-1a70f596519d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.139737 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.139778 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec252a2a-f9a4-4894-991d-1a70f596519d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.221004 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265351 4837 generic.go:334] "Generic (PLEG): container finished" podID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" exitCode=0 Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265426 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec252a2a-f9a4-4894-991d-1a70f596519d","Type":"ContainerDied","Data":"8963d958bbfe2f25190f6d4efa0bcd7a6fe7c107dfdb4e163c3ec794ab189d07"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265475 4837 scope.go:117] "RemoveContainer" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.265587 4837 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.300076 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerStarted","Data":"6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34"} Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.310794 4837 scope.go:117] "RemoveContainer" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.321467 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.339496 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.363260 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366671 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366711 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366773 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366783 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366811 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366818 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.366834 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.366842 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367713 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-central-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367774 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="ceilometer-notification-agent" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367793 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="sg-core" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.367821 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" containerName="proxy-httpd" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.377768 4837 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.384213 4837 scope.go:117] "RemoveContainer" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.389098 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.389471 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.411450 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.458489 4837 scope.go:117] "RemoveContainer" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.497692 4837 scope.go:117] "RemoveContainer" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.498052 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": container with ID starting with 4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07 not found: ID does not exist" containerID="4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498083 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07"} err="failed to get container status \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": rpc error: code = NotFound desc = could not find container \"4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07\": container with ID starting with 4ac16d5369630fdba4cee8a516cca217bdb0ae52551269ab68960255bd7bcb07 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498107 4837 scope.go:117] "RemoveContainer" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.498613 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": container with ID starting with 2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201 not found: ID does not exist" containerID="2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498682 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201"} err="failed to get container status \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": rpc error: code = NotFound desc = could not find container \"2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201\": container with ID starting with 2fa9cafe6c8b9f2bb8d388b54db3598398838d667dccefef13c7e8655cabb201 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.498718 4837 scope.go:117] "RemoveContainer" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" 
Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.499270 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": container with ID starting with 759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4 not found: ID does not exist" containerID="759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499325 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4"} err="failed to get container status \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": rpc error: code = NotFound desc = could not find container \"759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4\": container with ID starting with 759b2c4c55f496a02021bffec50cb8a1d6cfb6037ffefb8f05fc410f86a3f8d4 not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499363 4837 scope.go:117] "RemoveContainer" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: E0313 12:08:08.499836 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": container with ID starting with f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a not found: ID does not exist" containerID="f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.499890 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a"} err="failed to get container status \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": rpc error: code = NotFound desc = could not find container \"f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a\": container with ID starting with f13323c6b5c7472c3b9f76328f6f7d80a5868b615ab9c24cb1496e6b292c2e9a not found: ID does not exist" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567777 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567868 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567905 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567927 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.567983 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.568027 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.568066 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.572477 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 12:08:08 crc kubenswrapper[4837]: W0313 12:08:08.599543 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f3b003_127f_414f_877a_8f7df2872049.slice/crio-eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793 WatchSource:0}: Error finding container eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793: Status 404 returned error can't find the container with id eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793 Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669452 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669584 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669646 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669709 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.669881 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.670121 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.670241 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.681686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.684169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.685921 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.690627 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.696401 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"ceilometer-0\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " pod="openstack/ceilometer-0" Mar 13 12:08:08 crc kubenswrapper[4837]: I0313 12:08:08.737473 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.059000 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" path="/var/lib/kubelet/pods/9fdb2289-943a-4078-ab5f-cab9a7b4faf1/volumes" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.060119 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec252a2a-f9a4-4894-991d-1a70f596519d" path="/var/lib/kubelet/pods/ec252a2a-f9a4-4894-991d-1a70f596519d/volumes" Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.249603 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.324853 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"a06c9f38e731b74f1c733f23ddeb517873bc8240455c11686148ecef7617ff18"} Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.326955 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"44826e4def7b4e6d925e64be0fe446443f064dc0048fccb1c82bbe3a889f12c6"} Mar 13 12:08:09 crc kubenswrapper[4837]: I0313 12:08:09.326989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"eeae7f380050bdcbd8635a5445d8a2141e4fd6a8936c78d957ddea8b41ad3793"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.338521 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d0f3b003-127f-414f-877a-8f7df2872049","Type":"ContainerStarted","Data":"7a5e4afce92c029361a67abdc5df9ad06561515c47e3e0277455062bb70f9bca"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.343905 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} Mar 13 12:08:10 crc kubenswrapper[4837]: I0313 12:08:10.380787 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.380752175 podStartE2EDuration="3.380752175s" podCreationTimestamp="2026-03-13 12:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:10.35802159 +0000 UTC m=+1205.996288353" watchObservedRunningTime="2026-03-13 12:08:10.380752175 +0000 UTC m=+1206.019018938" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.217844 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.362371 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.795074 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.795155 4837 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.841940 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:08:11 crc kubenswrapper[4837]: I0313 12:08:11.845618 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378575 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:08:12 crc kubenswrapper[4837]: I0313 12:08:12.378594 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.399369 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.399716 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.441503 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:08:14 crc kubenswrapper[4837]: I0313 12:08:14.652920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 12:08:17 crc kubenswrapper[4837]: I0313 12:08:17.975780 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:17 crc kubenswrapper[4837]: I0313 12:08:17.976444 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.026404 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.026714 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.462773 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:18 crc kubenswrapper[4837]: I0313 12:08:18.463569 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.470593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerStarted","Data":"5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77"} Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.474678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerStarted","Data":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.474952 4837 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent" containerID="cri-o://8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" gracePeriod=30 Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475107 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd" containerID="cri-o://fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" gracePeriod=30 Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475143 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core" containerID="cri-o://670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" gracePeriod=30 Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.475177 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent" containerID="cri-o://f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" gracePeriod=30 Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.492309 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" podStartSLOduration=2.241168462 podStartE2EDuration="13.492284707s" podCreationTimestamp="2026-03-13 12:08:06 +0000 UTC" firstStartedPulling="2026-03-13 12:08:07.594509062 +0000 UTC m=+1203.232775825" lastFinishedPulling="2026-03-13 12:08:18.845625297 +0000 UTC m=+1214.483892070" observedRunningTime="2026-03-13 12:08:19.483858303 +0000 UTC m=+1215.122125066" watchObservedRunningTime="2026-03-13 12:08:19.492284707 +0000 UTC m=+1215.130551480" Mar 13 12:08:19 crc kubenswrapper[4837]: I0313 12:08:19.532974 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.93763292 podStartE2EDuration="11.532945716s" podCreationTimestamp="2026-03-13 12:08:08 +0000 UTC" firstStartedPulling="2026-03-13 12:08:09.250385004 +0000 UTC m=+1204.888651767" lastFinishedPulling="2026-03-13 12:08:18.8456978 +0000 UTC m=+1214.483964563" observedRunningTime="2026-03-13 12:08:19.505294556 +0000 UTC m=+1215.143561319" watchObservedRunningTime="2026-03-13 12:08:19.532945716 +0000 UTC m=+1215.171212509" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.366577 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485496 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" exitCode=0 Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485539 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" exitCode=2 Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485552 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" exitCode=0 Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485561 4837 generic.go:334] "Generic (PLEG): container finished" podID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" exitCode=0 Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485680 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485693 4837 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.485790 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486622 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486637 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486647 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486725 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4fdb7-b384-4d2d-9bd4-4d33884e828c","Type":"ContainerDied","Data":"a06c9f38e731b74f1c733f23ddeb517873bc8240455c11686148ecef7617ff18"} Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.486742 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.509814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc 
kubenswrapper[4837]: I0313 12:08:20.509898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.509927 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510006 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510077 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510108 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") pod \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\" (UID: \"08c4fdb7-b384-4d2d-9bd4-4d33884e828c\") " Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.510845 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.514049 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.515880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts" (OuterVolumeSpecName: "scripts") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.523668 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg" (OuterVolumeSpecName: "kube-api-access-p6jrg") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "kube-api-access-p6jrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.533950 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.551013 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.554808 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.560744 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.563122 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.593862 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.604919 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614283 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614569 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614580 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614588 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6jrg\" (UniqueName: \"kubernetes.io/projected/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-kube-api-access-p6jrg\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614597 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.614605 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639031 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.639948 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639977 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.639997 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640352 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 
670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640378 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640391 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640599 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640613 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640625 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.640854 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640868 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.640881 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.642611 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.642634 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643135 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643151 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643370 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.643397 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645012 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645041 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645600 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" Mar 
13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.645713 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646164 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646217 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646505 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646531 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646826 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.646855 4837 scope.go:117] "RemoveContainer" containerID="fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647095 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f"} err="failed to get container status \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": rpc error: code = NotFound desc = could not find container \"fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f\": container with ID starting with fffa014874e73390230eb7b6af8642c186fe9f3931f1eb8400b94d2c0e9a3d1f not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647119 4837 scope.go:117] "RemoveContainer" containerID="670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647302 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4"} err="failed to get container status 
\"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": rpc error: code = NotFound desc = could not find container \"670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4\": container with ID starting with 670c3039e4f25422b8c912438421c20e191370ef5c3d79dde1295ea0f24bd8e4 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647341 4837 scope.go:117] "RemoveContainer" containerID="f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647574 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8"} err="failed to get container status \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": rpc error: code = NotFound desc = could not find container \"f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8\": container with ID starting with f7e8a1c4cea6c266dbe0e59de861283c004a7f4e6af318e9419ad381b4ec1ce8 not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647601 4837 scope.go:117] "RemoveContainer" containerID="8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.647776 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be"} err="failed to get container status \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": rpc error: code = NotFound desc = could not find container \"8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be\": container with ID starting with 8495229a6ff71917e207208546bb16237ed6662319c65e8aa9a38672e19873be not found: ID does not exist" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.649839 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data" (OuterVolumeSpecName: "config-data") pod "08c4fdb7-b384-4d2d-9bd4-4d33884e828c" (UID: "08c4fdb7-b384-4d2d-9bd4-4d33884e828c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.716778 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4fdb7-b384-4d2d-9bd4-4d33884e828c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.819321 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.829114 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847798 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847816 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847842 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847850 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847870 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: E0313 12:08:20.847885 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.847892 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848075 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-notification-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848091 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="ceilometer-central-agent" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848103 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="sg-core" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.848118 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" containerName="proxy-httpd" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.851121 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.853773 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.853903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:08:20 crc kubenswrapper[4837]: I0313 12:08:20.861686 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023507 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023567 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023637 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023863 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.023923 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.024050 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.024088 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.066991 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c4fdb7-b384-4d2d-9bd4-4d33884e828c" path="/var/lib/kubelet/pods/08c4fdb7-b384-4d2d-9bd4-4d33884e828c/volumes" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126159 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126206 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126224 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.126439 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.127960 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.138951 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.140441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.140893 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.141232 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.143338 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.146258 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"ceilometer-0\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.347861 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:21 crc kubenswrapper[4837]: I0313 12:08:21.806551 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:22 crc kubenswrapper[4837]: I0313 12:08:22.514862 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"f59c8e71a086a620d4220bad0b44420b5cdb08a3b6ad9da08898e9871162295e"} Mar 13 12:08:23 crc kubenswrapper[4837]: I0313 12:08:23.524085 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"} Mar 13 12:08:23 crc kubenswrapper[4837]: I0313 12:08:23.524154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"} Mar 13 12:08:24 crc kubenswrapper[4837]: I0313 12:08:24.534170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"} Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.552500 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerStarted","Data":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.553058 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.567730 4837 scope.go:117] "RemoveContainer" containerID="bf1679f5dae4d4dbf23dda0605e595646a6c9aa5a55d2f380823eb7ec590b836" Mar 13 12:08:26 crc kubenswrapper[4837]: I0313 12:08:26.575302 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=3.087078206 podStartE2EDuration="6.575282277s" podCreationTimestamp="2026-03-13 12:08:20 +0000 UTC" firstStartedPulling="2026-03-13 12:08:21.809530232 +0000 UTC m=+1217.447796995" lastFinishedPulling="2026-03-13 12:08:25.297734303 +0000 UTC m=+1220.936001066" observedRunningTime="2026-03-13 12:08:26.568947108 +0000 UTC m=+1222.207213881" watchObservedRunningTime="2026-03-13 12:08:26.575282277 +0000 UTC m=+1222.213549040" Mar 13 12:08:27 crc kubenswrapper[4837]: I0313 12:08:27.169279 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566301 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" containerID="cri-o://8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566351 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" containerID="cri-o://e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566377 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" containerID="cri-o://6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" gracePeriod=30 Mar 13 12:08:28 crc kubenswrapper[4837]: I0313 12:08:28.566401 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" containerID="cri-o://49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" gracePeriod=30 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576481 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576517 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" exitCode=2 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576530 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576558 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.576612 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"} Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.578198 4837 generic.go:334] "Generic (PLEG): container finished" podID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerID="5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77" exitCode=0 Mar 13 12:08:29 crc kubenswrapper[4837]: I0313 12:08:29.578408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerDied","Data":"5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77"} Mar 13 12:08:30 crc kubenswrapper[4837]: I0313 12:08:30.950703 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130722 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130787 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.130846 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.131029 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") pod \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\" (UID: \"5d6d5bbe-7e5b-4645-95c4-af868cba3244\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.136992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts" (OuterVolumeSpecName: "scripts") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.151434 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2" (OuterVolumeSpecName: "kube-api-access-gqhf2") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "kube-api-access-gqhf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.158509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data" (OuterVolumeSpecName: "config-data") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.161624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d6d5bbe-7e5b-4645-95c4-af868cba3244" (UID: "5d6d5bbe-7e5b-4645-95c4-af868cba3244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233476 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233523 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqhf2\" (UniqueName: \"kubernetes.io/projected/5d6d5bbe-7e5b-4645-95c4-af868cba3244-kube-api-access-gqhf2\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233542 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.233555 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d6d5bbe-7e5b-4645-95c4-af868cba3244-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.528410 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607025 4837 generic.go:334] "Generic (PLEG): container finished" podID="f9e01084-6025-433d-99d8-36d2c555c685" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" exitCode=0 Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607094 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607120 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9e01084-6025-433d-99d8-36d2c555c685","Type":"ContainerDied","Data":"f59c8e71a086a620d4220bad0b44420b5cdb08a3b6ad9da08898e9871162295e"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.607173 4837 scope.go:117] "RemoveContainer" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" event={"ID":"5d6d5bbe-7e5b-4645-95c4-af868cba3244","Type":"ContainerDied","Data":"6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34"} Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611339 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6144ca86ef9d0d9f5e120027d710fea9eb400bcc8f2a208f56ef661ebbec1f34" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.611427 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-f6gwd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.631035 4837 scope.go:117] "RemoveContainer" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643320 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643503 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643539 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643676 4837 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643790 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") pod \"f9e01084-6025-433d-99d8-36d2c555c685\" (UID: \"f9e01084-6025-433d-99d8-36d2c555c685\") " Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.643887 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.644260 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.644258 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.655955 4837 scope.go:117] "RemoveContainer" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.658227 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs" (OuterVolumeSpecName: "kube-api-access-7w9zs") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "kube-api-access-7w9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.673741 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts" (OuterVolumeSpecName: "scripts") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.691473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.691498 4837 scope.go:117] "RemoveContainer" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.707447 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708328 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708358 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708394 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708403 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708415 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708424 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708438 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708446 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.708464 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708472 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708701 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="sg-core" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708715 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-notification-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708778 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="ceilometer-central-agent" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708794 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" containerName="nova-cell0-conductor-db-sync" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.708808 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e01084-6025-433d-99d8-36d2c555c685" containerName="proxy-httpd" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.709390 4837 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.714091 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.714292 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qctwr" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.722361 4837 scope.go:117] "RemoveContainer" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.724189 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.727252 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": container with ID starting with 49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f not found: ID does not exist" containerID="49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.727309 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f"} err="failed to get container status \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": rpc error: code = NotFound desc = could not find container \"49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f\": container with ID starting with 49ab00cb0159b6758d37c40cf93ce155c8aa226ad0a842f4f68d853167d1807f not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.727343 4837 scope.go:117] "RemoveContainer" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.729448 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": container with ID starting with e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00 not found: ID does not exist" containerID="e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.729524 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00"} err="failed to get container status \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": rpc error: code = NotFound desc = could not find container \"e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00\": container with ID starting with e6fcba9318ec55a9f50d812ffd1a39b187c76c67f7a03dde9a6836d845ab1f00 not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.729590 4837 scope.go:117] "RemoveContainer" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.730280 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": container with ID starting with 
6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c not found: ID does not exist" containerID="6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730332 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c"} err="failed to get container status \"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": rpc error: code = NotFound desc = could not find container \"6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c\": container with ID starting with 6bdb083352f5cd3edf1d10575e1157207eed447e6f437afa820f995c0285299c not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730356 4837 scope.go:117] "RemoveContainer" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: E0313 12:08:31.730897 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": container with ID starting with 8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29 not found: ID does not exist" containerID="8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.730928 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29"} err="failed to get container status \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": rpc error: code = NotFound desc = could not find container \"8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29\": container with ID starting with 8b144003cd78c701982819cb7d0748db2e568902b9482e1bbdfbf0e7a4fd8a29 not found: ID does not exist" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.743045 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746113 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746148 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746161 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9zs\" (UniqueName: \"kubernetes.io/projected/f9e01084-6025-433d-99d8-36d2c555c685-kube-api-access-7w9zs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746171 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.746179 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9e01084-6025-433d-99d8-36d2c555c685-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.764621 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data" (OuterVolumeSpecName: "config-data") pod "f9e01084-6025-433d-99d8-36d2c555c685" (UID: "f9e01084-6025-433d-99d8-36d2c555c685"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.848205 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.848279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.849126 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.849387 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e01084-6025-433d-99d8-36d2c555c685-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.944363 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951321 4837 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951394 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.951457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.956674 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.960837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.973087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnx86\" (UniqueName: \"kubernetes.io/projected/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-kube-api-access-jnx86\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.978830 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58240a84-c8ab-43a9-8113-eaf2d0ddea2e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58240a84-c8ab-43a9-8113-eaf2d0ddea2e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.984802 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.987563 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.991152 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:08:31 crc kubenswrapper[4837]: I0313 12:08:31.991289 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:31.999107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.035337 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.155530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.155980 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156054 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156288 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.156410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258379 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258536 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258574 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.258610 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.259078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.259165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.263759 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.264247 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.264702 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.268340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.276699 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"ceilometer-0\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.459762 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.467262 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.622821 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58240a84-c8ab-43a9-8113-eaf2d0ddea2e","Type":"ContainerStarted","Data":"ca3091bb68a4c8ccce552dcb4050de7d08cec268f979f37a313240951c2a5722"} Mar 13 12:08:32 crc kubenswrapper[4837]: I0313 12:08:32.906093 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:08:32 crc kubenswrapper[4837]: W0313 12:08:32.906793 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7f70330_cb87_42e5_96c8_6d54828f2a5a.slice/crio-c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0 WatchSource:0}: Error finding container c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0: Status 404 returned error can't find the container with id c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0 Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.063276 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e01084-6025-433d-99d8-36d2c555c685" path="/var/lib/kubelet/pods/f9e01084-6025-433d-99d8-36d2c555c685/volumes" Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.634661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58240a84-c8ab-43a9-8113-eaf2d0ddea2e","Type":"ContainerStarted","Data":"b6b42276aa11a1a7a9b37c345c43f7aaa45f27dcce528886b8a09316471865cf"} Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.634993 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.637057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0"} Mar 13 12:08:33 crc kubenswrapper[4837]: I0313 12:08:33.677405 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.677377967 podStartE2EDuration="2.677377967s" podCreationTimestamp="2026-03-13 12:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:33.647666594 +0000 UTC m=+1229.285933367" watchObservedRunningTime="2026-03-13 12:08:33.677377967 +0000 UTC m=+1229.315644730" Mar 13 12:08:34 crc kubenswrapper[4837]: I0313 12:08:34.649553 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075"} Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484394 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484483 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.484558 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.485755 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.485837 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" gracePeriod=600 Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.660922 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" exitCode=0 Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.661015 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71"} Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.661318 4837 scope.go:117] "RemoveContainer" containerID="62df99fa64e257c350cea1390039e0bd2f2c672bf6d80836ec3df94beec3d8d1" Mar 13 12:08:35 crc kubenswrapper[4837]: I0313 12:08:35.664203 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094"} Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.488196 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.488249 4837 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/glance-default-internal-api-0" podUID="9fdb2289-943a-4078-ab5f-cab9a7b4faf1" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9292/healthcheck\": context deadline exceeded" Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.673530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"} Mar 13 12:08:36 crc kubenswrapper[4837]: I0313 12:08:36.676520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569"} Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.088620 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.569984 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.571980 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.576128 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.576880 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.585099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.672994 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673129 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673167 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.673273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc 
kubenswrapper[4837]: I0313 12:08:37.729054 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.730572 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.741538 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.748558 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776548 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776723 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.776818 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.793398 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.793709 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.803054 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.815229 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.816607 4837 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.818396 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"nova-cell0-cell-mapping-xlps2\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.820257 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.824204 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.878785 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.878907 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.879094 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.897149 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.932299 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.933711 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.948432 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980333 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980468 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980546 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980623 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.980715 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.988591 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.996369 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:37 crc kubenswrapper[4837]: I0313 12:08:37.996452 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.029111 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.063207 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086493 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086603 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086683 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086762 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086788 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.086834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.091622 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.097940 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.111833 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.113597 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.127468 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.184630 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.198479 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"nova-scheduler-0\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199222 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199318 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199436 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.199793 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.200593 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.200964 4837 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.201030 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.201520 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.218331 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.220301 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.240261 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"nova-api-0\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303943 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.303984 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.311439 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.313691 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.315349 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.345814 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"nova-metadata-0\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.347032 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.348501 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.385702 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.419770 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.463077 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.492091 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.509950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510015 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510099 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510251 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.510306 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611562 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.611980 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod 
\"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612008 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612058 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.612090 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.613145 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.613837 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.614372 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.614491 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.615125 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.648588 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"dnsmasq-dns-757b4f8459-5blpv\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " 
pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.696136 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.758353 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerStarted","Data":"94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5"} Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.759593 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.785409 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:08:38 crc kubenswrapper[4837]: I0313 12:08:38.796769 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.407149768 podStartE2EDuration="7.796750091s" podCreationTimestamp="2026-03-13 12:08:31 +0000 UTC" firstStartedPulling="2026-03-13 12:08:32.909031394 +0000 UTC m=+1228.547298177" lastFinishedPulling="2026-03-13 12:08:38.298631737 +0000 UTC m=+1233.936898500" observedRunningTime="2026-03-13 12:08:38.785744094 +0000 UTC m=+1234.424010857" watchObservedRunningTime="2026-03-13 12:08:38.796750091 +0000 UTC m=+1234.435016844" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.046928 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.137949 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.229301 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.231680 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.242866 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.242922 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.274134 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.294984 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707289ff_1434_49b7_904a_58decfdd53ca.slice/crio-b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b WatchSource:0}: Error finding container b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b: Status 404 returned error can't find the container with id b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.301625 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.308950 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod820f49e8_5f60_46ba_80a8_6314d4ae2c48.slice/crio-96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f WatchSource:0}: Error finding container 96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f: Status 404 returned error can't find the container with id 96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.338266 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365681 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365741 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.365947 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.366029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" 
(UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.409857 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:08:39 crc kubenswrapper[4837]: W0313 12:08:39.411963 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de330b6_0bbb_4a9d_9062_9c7ed182a189.slice/crio-d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff WatchSource:0}: Error finding container d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff: Status 404 returned error can't find the container with id d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.467722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468403 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468541 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.468712 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.471377 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.473323 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.473935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " 
pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.488350 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"nova-cell1-conductor-db-sync-8mzt4\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.562759 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.786595 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerStarted","Data":"5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.786923 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerStarted","Data":"4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.788369 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerStarted","Data":"78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.789454 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.796173 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.801537 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xlps2" podStartSLOduration=2.801520946 podStartE2EDuration="2.801520946s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:39.800748892 +0000 UTC m=+1235.439015675" watchObservedRunningTime="2026-03-13 12:08:39.801520946 +0000 UTC m=+1235.439787709" Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.803935 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerStarted","Data":"b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810014 4837 generic.go:334] "Generic (PLEG): container finished" podID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" exitCode=0 Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" 
event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9"} Mar 13 12:08:39 crc kubenswrapper[4837]: I0313 12:08:39.810507 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerStarted","Data":"d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.107957 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.835340 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerStarted","Data":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.835850 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.846349 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerStarted","Data":"deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.846418 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerStarted","Data":"8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1"} Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.862464 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" podStartSLOduration=2.86244602 podStartE2EDuration="2.86244602s" podCreationTimestamp="2026-03-13 12:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:40.855537942 +0000 UTC m=+1236.493804705" watchObservedRunningTime="2026-03-13 12:08:40.86244602 +0000 UTC m=+1236.500712783" Mar 13 12:08:40 crc kubenswrapper[4837]: I0313 12:08:40.886859 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" podStartSLOduration=1.8868358779999999 podStartE2EDuration="1.886835878s" podCreationTimestamp="2026-03-13 12:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:40.873997364 +0000 UTC m=+1236.512264127" watchObservedRunningTime="2026-03-13 12:08:40.886835878 +0000 UTC m=+1236.525102641" Mar 13 12:08:41 crc kubenswrapper[4837]: I0313 12:08:41.822416 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:08:41 crc kubenswrapper[4837]: I0313 12:08:41.832333 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.877699 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerStarted","Data":"2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6"} Mar 13 12:08:42 
crc kubenswrapper[4837]: I0313 12:08:42.877905 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" gracePeriod=30 Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.881165 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.883219 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.897493 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerStarted","Data":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.903276 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5364554310000003 podStartE2EDuration="5.903258027s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.14038959 +0000 UTC m=+1234.778656353" lastFinishedPulling="2026-03-13 12:08:42.507192186 +0000 UTC m=+1238.145458949" observedRunningTime="2026-03-13 12:08:42.89700995 +0000 UTC m=+1238.535276723" watchObservedRunningTime="2026-03-13 12:08:42.903258027 +0000 UTC m=+1238.541524790" Mar 13 12:08:42 crc kubenswrapper[4837]: I0313 12:08:42.921118 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.707180286 podStartE2EDuration="5.921097688s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.297300101 +0000 UTC m=+1234.935566864" lastFinishedPulling="2026-03-13 12:08:42.511217503 +0000 UTC m=+1238.149484266" observedRunningTime="2026-03-13 12:08:42.913195129 +0000 UTC m=+1238.551461892" watchObservedRunningTime="2026-03-13 12:08:42.921097688 +0000 UTC m=+1238.559364451" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.064094 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.492610 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.908552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerStarted","Data":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.908770 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" containerID="cri-o://e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" gracePeriod=30 Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.909407 
4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" containerID="cri-o://840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" gracePeriod=30 Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.915852 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerStarted","Data":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.932525 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.702636859 podStartE2EDuration="5.932509243s" podCreationTimestamp="2026-03-13 12:08:38 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.311275861 +0000 UTC m=+1234.949542624" lastFinishedPulling="2026-03-13 12:08:42.541148245 +0000 UTC m=+1238.179415008" observedRunningTime="2026-03-13 12:08:43.925607216 +0000 UTC m=+1239.563873979" watchObservedRunningTime="2026-03-13 12:08:43.932509243 +0000 UTC m=+1239.570775996" Mar 13 12:08:43 crc kubenswrapper[4837]: I0313 12:08:43.955357 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.511674426 podStartE2EDuration="6.955331382s" podCreationTimestamp="2026-03-13 12:08:37 +0000 UTC" firstStartedPulling="2026-03-13 12:08:39.068331071 +0000 UTC m=+1234.706597834" lastFinishedPulling="2026-03-13 12:08:42.511988017 +0000 UTC m=+1238.150254790" observedRunningTime="2026-03-13 12:08:43.949616512 +0000 UTC m=+1239.587883285" watchObservedRunningTime="2026-03-13 12:08:43.955331382 +0000 UTC m=+1239.593598145" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.482735 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583421 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583501 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.583769 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") pod \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\" (UID: \"820f49e8-5f60-46ba-80a8-6314d4ae2c48\") " Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.584155 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs" (OuterVolumeSpecName: "logs") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.584276 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/820f49e8-5f60-46ba-80a8-6314d4ae2c48-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.593913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7" (OuterVolumeSpecName: "kube-api-access-6gkl7") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "kube-api-access-6gkl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.609884 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data" (OuterVolumeSpecName: "config-data") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.612398 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820f49e8-5f60-46ba-80a8-6314d4ae2c48" (UID: "820f49e8-5f60-46ba-80a8-6314d4ae2c48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696848 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696907 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820f49e8-5f60-46ba-80a8-6314d4ae2c48-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.696921 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gkl7\" (UniqueName: \"kubernetes.io/projected/820f49e8-5f60-46ba-80a8-6314d4ae2c48-kube-api-access-6gkl7\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928150 4837 generic.go:334] "Generic (PLEG): container finished" podID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" exitCode=0 Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928192 4837 generic.go:334] "Generic (PLEG): container finished" podID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" exitCode=143 Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928258 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928283 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928295 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"820f49e8-5f60-46ba-80a8-6314d4ae2c48","Type":"ContainerDied","Data":"96afc52a22b33bcafbbfbaef6c12e2832f3e29442fefe3b0c048dab79efe407f"} Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.928313 4837 scope.go:117] "RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.975545 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.986490 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996351 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996407 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:44 crc kubenswrapper[4837]: E0313 12:08:44.996823 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996835 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: E0313 12:08:44.996864 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.996871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.997056 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-metadata" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.997080 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" containerName="nova-metadata-log" Mar 13 12:08:44 crc kubenswrapper[4837]: I0313 12:08:44.998143 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.000426 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.000600 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.005210 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.048810 4837 scope.go:117] "RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.052884 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.052940 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} err="failed to get container status \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": rpc error: code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.052971 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.053359 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.053396 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} err="failed to get container status \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.053423 4837 scope.go:117] "RemoveContainer" containerID="840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054124 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7"} err="failed to get container status \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": rpc error: 
code = NotFound desc = could not find container \"840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7\": container with ID starting with 840e4e9d2c2d0340a81c3beb9f6f9b82c73945fdc9f045f44ef5cce0128da5c7 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054148 4837 scope.go:117] "RemoveContainer" containerID="e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.054964 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6"} err="failed to get container status \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": rpc error: code = NotFound desc = could not find container \"e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6\": container with ID starting with e203009f7f90bcd5acdca760bbec4abb4ed13e0d582406ed97971a2f55cbb3a6 not found: ID does not exist" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.062680 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820f49e8-5f60-46ba-80a8-6314d4ae2c48" path="/var/lib/kubelet/pods/820f49e8-5f60-46ba-80a8-6314d4ae2c48/volumes" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.107843 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.107956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108022 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108059 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.108075 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: E0313 12:08:45.208196 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod820f49e8_5f60_46ba_80a8_6314d4ae2c48.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:08:45 crc 
kubenswrapper[4837]: I0313 12:08:45.210122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210260 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210430 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210653 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.210817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.211925 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215211 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.215721 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.232536 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"nova-metadata-0\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " 
pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.324355 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.766541 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:45 crc kubenswrapper[4837]: W0313 12:08:45.777100 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d0251c8_3594_482e_bd3c_2ca33c9e0ab5.slice/crio-e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4 WatchSource:0}: Error finding container e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4: Status 404 returned error can't find the container with id e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4 Mar 13 12:08:45 crc kubenswrapper[4837]: I0313 12:08:45.941907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.956483 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.957042 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerStarted","Data":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} Mar 13 12:08:46 crc kubenswrapper[4837]: I0313 12:08:46.980782 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.980764639 podStartE2EDuration="2.980764639s" podCreationTimestamp="2026-03-13 12:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:46.972816619 +0000 UTC m=+1242.611083402" watchObservedRunningTime="2026-03-13 12:08:46.980764639 +0000 UTC m=+1242.619031402" Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.967335 4837 generic.go:334] "Generic (PLEG): container finished" podID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerID="5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140" exitCode=0 Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.967423 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerDied","Data":"5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140"} Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.969713 4837 generic.go:334] "Generic (PLEG): container finished" podID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerID="deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c" exitCode=0 Mar 13 12:08:47 crc kubenswrapper[4837]: I0313 12:08:47.969754 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerDied","Data":"deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c"} Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 
12:08:48.421142 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.421582 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.493171 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.521397 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.697880 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.757219 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.757453 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns" containerID="cri-o://308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a" gracePeriod=10 Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.983559 4837 generic.go:334] "Generic (PLEG): container finished" podID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerID="308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a" exitCode=0 Mar 13 12:08:48 crc kubenswrapper[4837]: I0313 12:08:48.984390 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a"} Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.041241 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.353885 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503315 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503402 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503485 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503585 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503613 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.503771 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") pod \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\" (UID: \"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.505423 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.508445 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.515671 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2" (OuterVolumeSpecName: "kube-api-access-s2wh2") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "kube-api-access-s2wh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.555328 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.574209 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.583376 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.587965 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.596490 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config" (OuterVolumeSpecName: "config") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609682 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609778 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609902 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") pod \"53268342-9adb-48b3-ba5b-52634c2c68fe\" (UID: \"53268342-9adb-48b3-ba5b-52634c2c68fe\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.609988 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610058 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610093 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610169 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") pod \"02b82791-6ef3-4a93-9d5a-84065d62775d\" (UID: \"02b82791-6ef3-4a93-9d5a-84065d62775d\") " Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610776 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610798 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610810 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610824 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2wh2\" (UniqueName: \"kubernetes.io/projected/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-kube-api-access-s2wh2\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.610840 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" (UID: "fd9a8546-e61b-47e0-90b9-e6c8e4365b0b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614516 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz" (OuterVolumeSpecName: "kube-api-access-fltcz") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "kube-api-access-fltcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.614684 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc" (OuterVolumeSpecName: "kube-api-access-8kwkc") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "kube-api-access-8kwkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.616937 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts" (OuterVolumeSpecName: "scripts") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.617008 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts" (OuterVolumeSpecName: "scripts") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.637927 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.658357 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data" (OuterVolumeSpecName: "config-data") pod "53268342-9adb-48b3-ba5b-52634c2c68fe" (UID: "53268342-9adb-48b3-ba5b-52634c2c68fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.660975 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.663889 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data" (OuterVolumeSpecName: "config-data") pod "02b82791-6ef3-4a93-9d5a-84065d62775d" (UID: "02b82791-6ef3-4a93-9d5a-84065d62775d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713074 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713137 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713155 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02b82791-6ef3-4a93-9d5a-84065d62775d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713180 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713201 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fltcz\" (UniqueName: \"kubernetes.io/projected/53268342-9adb-48b3-ba5b-52634c2c68fe-kube-api-access-fltcz\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713217 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713235 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwkc\" (UniqueName: \"kubernetes.io/projected/02b82791-6ef3-4a93-9d5a-84065d62775d-kube-api-access-8kwkc\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713251 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.713265 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53268342-9adb-48b3-ba5b-52634c2c68fe-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994057 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xlps2" event={"ID":"53268342-9adb-48b3-ba5b-52634c2c68fe","Type":"ContainerDied","Data":"4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4"} Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994102 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4965c8cb939b3555313bf5ff81aac80d2a106589cf591b13befb28e52d15f3d4" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.994209 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xlps2" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" event={"ID":"fd9a8546-e61b-47e0-90b9-e6c8e4365b0b","Type":"ContainerDied","Data":"3b4bbdde4e1a36119cc27a40f2a694902d8b5f53fa6c902b59c1385e734f5a5e"} Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998950 4837 scope.go:117] "RemoveContainer" containerID="308f5a2ca30c72015ad1831a239549e973a6a698921b4916b0e838cdf0b49c8a" Mar 13 12:08:49 crc kubenswrapper[4837]: I0313 12:08:49.998976 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-txzkw" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000484 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" event={"ID":"02b82791-6ef3-4a93-9d5a-84065d62775d","Type":"ContainerDied","Data":"8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1"} Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000499 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8mzt4" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.000574 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e78351e0267ed6011c0c508a5d86a34c1efe986a8b326b91fdf940d351283e1" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.030561 4837 scope.go:117] "RemoveContainer" containerID="18aeb282fbd8558fc7f2a4d93c502285e6ae25649a3f62cf2708ff5492d7993d" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.133360 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.154930 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-txzkw"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.202629 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.203384 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.207599 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync" Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.207821 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="init" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.207899 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="init" Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.208009 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208091 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage" Mar 13 12:08:50 crc kubenswrapper[4837]: E0313 12:08:50.208198 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns" Mar 13 12:08:50 crc 
kubenswrapper[4837]: I0313 12:08:50.208274 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208862 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" containerName="nova-manage" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.208968 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" containerName="nova-cell1-conductor-db-sync" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.209059 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" containerName="dnsmasq-dns" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.212889 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.224536 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.230968 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.235613 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.236559 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.236732 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.320741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.324713 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.324840 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338497 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.338537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.342792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346276 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346595 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" containerID="cri-o://0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" gracePeriod=30 Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.346728 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" containerID="cri-o://ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" gracePeriod=30 Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.357700 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.366311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7rd\" (UniqueName: \"kubernetes.io/projected/9a51debb-c1cb-4a55-b845-e89d89d11e86-kube-api-access-hx7rd\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.367210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a51debb-c1cb-4a55-b845-e89d89d11e86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9a51debb-c1cb-4a55-b845-e89d89d11e86\") " pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.550706 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:50 crc kubenswrapper[4837]: I0313 12:08:50.993976 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 12:08:50 crc kubenswrapper[4837]: W0313 12:08:50.997360 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a51debb_c1cb_4a55_b845_e89d89d11e86.slice/crio-8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7 WatchSource:0}: Error finding container 8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7: Status 404 returned error can't find the container with id 8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7 Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.014205 4837 generic.go:334] "Generic (PLEG): container finished" podID="f3179576-07e2-4e05-8d10-01e3d694863b" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" exitCode=143 Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.014276 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"} Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.016902 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9a51debb-c1cb-4a55-b845-e89d89d11e86","Type":"ContainerStarted","Data":"8cfc11ef296434fbdcd112fcff3fcf482dfc699684f6173db012822b534b56d7"} Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.026290 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" containerID="cri-o://de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" gracePeriod=30 Mar 13 12:08:51 crc kubenswrapper[4837]: I0313 12:08:51.071262 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9a8546-e61b-47e0-90b9-e6c8e4365b0b" path="/var/lib/kubelet/pods/fd9a8546-e61b-47e0-90b9-e6c8e4365b0b/volumes" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.045512 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" containerID="cri-o://2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" gracePeriod=30 Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.046808 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9a51debb-c1cb-4a55-b845-e89d89d11e86","Type":"ContainerStarted","Data":"c1cca02df9f56d80002d5370498f9f6c551789fcd9e78dfef99dca9ce9a40416"} Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.047078 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.047131 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" containerID="cri-o://3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" gracePeriod=30 Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.072455 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.072439004 podStartE2EDuration="2.072439004s" podCreationTimestamp="2026-03-13 12:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:52.065766794 +0000 UTC m=+1247.704033547" watchObservedRunningTime="2026-03-13 12:08:52.072439004 +0000 UTC m=+1247.710705767" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.622259 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.704957 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.705042 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.705065 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") pod \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\" (UID: \"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5\") " Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.706844 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs" (OuterVolumeSpecName: "logs") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.710444 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv" (OuterVolumeSpecName: "kube-api-access-5hzrv") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "kube-api-access-5hzrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.748350 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.752415 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data" (OuterVolumeSpecName: "config-data") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.757723 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" (UID: "4d0251c8-3594-482e-bd3c-2ca33c9e0ab5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806708 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806744 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806754 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hzrv\" (UniqueName: \"kubernetes.io/projected/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-kube-api-access-5hzrv\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806764 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:52 crc kubenswrapper[4837]: I0313 12:08:52.806774 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056721 4837 generic.go:334] "Generic (PLEG): container finished" podID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" exitCode=0 Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056767 4837 generic.go:334] "Generic (PLEG): container finished" podID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" exitCode=143 Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.056951 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.067884 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4d0251c8-3594-482e-bd3c-2ca33c9e0ab5","Type":"ContainerDied","Data":"e3827fca94d00b12ca92bacbca73cd8bbf6d6f767ab5ac87612ce11ca72155b4"} Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.068012 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.107968 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.108786 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.128785 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.136513 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.136967 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137009 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} err="failed to get container status \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137035 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.137340 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 
12:08:53.137382 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} err="failed to get container status \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": rpc error: code = NotFound desc = could not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137407 4837 scope.go:117] "RemoveContainer" containerID="3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137624 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a"} err="failed to get container status \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": rpc error: code = NotFound desc = could not find container \"3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a\": container with ID starting with 3d7f2f79cf7018648ce988483c617478d157b963d57e5926b659be5f5a8f979a not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137658 4837 scope.go:117] "RemoveContainer" containerID="2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.137871 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495"} err="failed to get container status \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": rpc error: code = NotFound desc = could not find container \"2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495\": container with ID starting with 2d6f702dc40ebdf06e59f7de571c86b0307e438343613d9494734ff2a3295495 not found: ID does not exist" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139282 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.139696 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139708 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.139747 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139753 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139915 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-log" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.139936 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" containerName="nova-metadata-metadata" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.140935 4837 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.143543 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.143888 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.150845 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.319933 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320028 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320107 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.320192 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422520 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422624 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422685 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " 
pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.422827 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.424466 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.427353 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.427595 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.431668 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.450095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"nova-metadata-0\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.494910 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.496155 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.497843 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:08:53 crc kubenswrapper[4837]: E0313 12:08:53.497936 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.510801 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:08:53 crc kubenswrapper[4837]: I0313 12:08:53.951302 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:08:53 crc kubenswrapper[4837]: W0313 12:08:53.954450 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5000e5ff_8cf6_4f0c_a6c4_e6b550c2fe43.slice/crio-c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880 WatchSource:0}: Error finding container c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880: Status 404 returned error can't find the container with id c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880 Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.067670 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880"} Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.710198 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888589 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.888936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") pod \"707289ff-1434-49b7-904a-58decfdd53ca\" (UID: \"707289ff-1434-49b7-904a-58decfdd53ca\") " Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.894229 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg" (OuterVolumeSpecName: "kube-api-access-jzqjg") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "kube-api-access-jzqjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.915085 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data" (OuterVolumeSpecName: "config-data") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.917806 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707289ff-1434-49b7-904a-58decfdd53ca" (UID: "707289ff-1434-49b7-904a-58decfdd53ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994443 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqjg\" (UniqueName: \"kubernetes.io/projected/707289ff-1434-49b7-904a-58decfdd53ca-kube-api-access-jzqjg\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994484 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:54 crc kubenswrapper[4837]: I0313 12:08:54.994495 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707289ff-1434-49b7-904a-58decfdd53ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.038612 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.074366 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0251c8-3594-482e-bd3c-2ca33c9e0ab5" path="/var/lib/kubelet/pods/4d0251c8-3594-482e-bd3c-2ca33c9e0ab5/volumes" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.085899 4837 generic.go:334] "Generic (PLEG): container finished" podID="f3179576-07e2-4e05-8d10-01e3d694863b" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" exitCode=0 Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.085990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089365 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3179576-07e2-4e05-8d10-01e3d694863b","Type":"ContainerDied","Data":"462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089412 4837 scope.go:117] "RemoveContainer" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.089550 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099537 4837 generic.go:334] "Generic (PLEG): container finished" podID="707289ff-1434-49b7-904a-58decfdd53ca" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" exitCode=0 Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099594 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerDied","Data":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.099700 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"707289ff-1434-49b7-904a-58decfdd53ca","Type":"ContainerDied","Data":"b6785bc0ca408832d28cd32714ea145d9a6e0bbc829424d2cf876cff8cb2427b"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.105291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.105341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerStarted","Data":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.116964 4837 scope.go:117] "RemoveContainer" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.137072 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137036845 podStartE2EDuration="2.137036845s" podCreationTimestamp="2026-03-13 12:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:55.127241106 +0000 UTC m=+1250.765507899" watchObservedRunningTime="2026-03-13 12:08:55.137036845 +0000 UTC m=+1250.775303608" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144474 4837 scope.go:117] "RemoveContainer" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.144916 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": container with ID starting with ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a not found: ID does not exist" containerID="ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144959 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a"} err="failed to get container status \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": rpc error: code = NotFound desc = could not find container \"ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a\": container with ID 
starting with ccea0ec2cd3b8c08290e7221354973c6421a6b999d1adffb63e82adac076716a not found: ID does not exist" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.144988 4837 scope.go:117] "RemoveContainer" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.145726 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": container with ID starting with 0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397 not found: ID does not exist" containerID="0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.145769 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397"} err="failed to get container status \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": rpc error: code = NotFound desc = could not find container \"0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397\": container with ID starting with 0511fe858584c41b4362fa4eb0bbad5c40393493b881d67bbae3af394094d397 not found: ID does not exist" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.145794 4837 scope.go:117] "RemoveContainer" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.149246 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.163103 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.171234 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172672 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172698 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172716 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172726 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.172745 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172754 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-log" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.172980 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" containerName="nova-api-api" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.173008 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" 
containerName="nova-api-log" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.173027 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="707289ff-1434-49b7-904a-58decfdd53ca" containerName="nova-scheduler-scheduler" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.175290 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.177896 4837 scope.go:117] "RemoveContainer" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.178142 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.179067 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": container with ID starting with de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089 not found: ID does not exist" containerID="de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.179096 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089"} err="failed to get container status \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": rpc error: code = NotFound desc = could not find container \"de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089\": container with ID starting with de14fa2730ef467496fc05d9ce620e2ff356ba12b2dd9494751b1d31dcb5f089 not found: ID does not exist" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198599 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198741 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198900 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.198942 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") pod \"f3179576-07e2-4e05-8d10-01e3d694863b\" (UID: \"f3179576-07e2-4e05-8d10-01e3d694863b\") " Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.206931 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs" (OuterVolumeSpecName: "logs") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.214385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl" (OuterVolumeSpecName: "kube-api-access-85cfl") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "kube-api-access-85cfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.221947 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.232718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data" (OuterVolumeSpecName: "config-data") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.244032 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3179576-07e2-4e05-8d10-01e3d694863b" (UID: "f3179576-07e2-4e05-8d10-01e3d694863b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300794 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.300954 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301013 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301026 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3179576-07e2-4e05-8d10-01e3d694863b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.301035 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3179576-07e2-4e05-8d10-01e3d694863b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc 
kubenswrapper[4837]: I0313 12:08:55.301045 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85cfl\" (UniqueName: \"kubernetes.io/projected/f3179576-07e2-4e05-8d10-01e3d694863b-kube-api-access-85cfl\") on node \"crc\" DevicePath \"\"" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402199 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402368 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.402415 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.406549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.413589 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.424165 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.424663 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"nova-scheduler-0\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.433732 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.461285 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.463185 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.465577 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.474462 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:55 crc kubenswrapper[4837]: E0313 12:08:55.525416 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3179576_07e2_4e05_8d10_01e3d694863b.slice/crio-462ea19aaa8a4b2f42cf4a80e03784c4432ff7806e973f2c0cf7363762b9df8e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3179576_07e2_4e05_8d10_01e3d694863b.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605154 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605233 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605338 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.605383 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.630959 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711312 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711370 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711461 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711501 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.711939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.722876 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.722974 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.735825 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"nova-api-0\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " pod="openstack/nova-api-0" Mar 13 12:08:55 crc kubenswrapper[4837]: I0313 12:08:55.824721 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:08:56 crc kubenswrapper[4837]: W0313 12:08:56.073319 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cc7473d_2608_4989_990f_a19d70e8a3a3.slice/crio-7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454 WatchSource:0}: Error finding container 7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454: Status 404 returned error can't find the container with id 7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454 Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.074555 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.118364 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerStarted","Data":"7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454"} Mar 13 12:08:56 crc kubenswrapper[4837]: I0313 12:08:56.236990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:08:56 crc kubenswrapper[4837]: W0313 12:08:56.239402 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534b3e48_da2d_41b6_af02_bef43adcac21.slice/crio-bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706 WatchSource:0}: Error finding container bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706: Status 404 returned error can't find the container with id bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706 Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.060876 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707289ff-1434-49b7-904a-58decfdd53ca" path="/var/lib/kubelet/pods/707289ff-1434-49b7-904a-58decfdd53ca/volumes" Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.062440 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3179576-07e2-4e05-8d10-01e3d694863b" path="/var/lib/kubelet/pods/f3179576-07e2-4e05-8d10-01e3d694863b/volumes" Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.129667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerStarted","Data":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"} Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.132948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"} Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.133164 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"} Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.133241 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerStarted","Data":"bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706"} Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.156728 4837 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.156703975 podStartE2EDuration="2.156703975s" podCreationTimestamp="2026-03-13 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:57.145888235 +0000 UTC m=+1252.784154998" watchObservedRunningTime="2026-03-13 12:08:57.156703975 +0000 UTC m=+1252.794970738" Mar 13 12:08:57 crc kubenswrapper[4837]: I0313 12:08:57.170769 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.170741507 podStartE2EDuration="2.170741507s" podCreationTimestamp="2026-03-13 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:08:57.162924801 +0000 UTC m=+1252.801191584" watchObservedRunningTime="2026-03-13 12:08:57.170741507 +0000 UTC m=+1252.809008270" Mar 13 12:08:58 crc kubenswrapper[4837]: I0313 12:08:58.510941 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:08:58 crc kubenswrapper[4837]: I0313 12:08:58.511289 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:09:00 crc kubenswrapper[4837]: I0313 12:09:00.580455 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 12:09:00 crc kubenswrapper[4837]: I0313 12:09:00.632059 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:09:02 crc kubenswrapper[4837]: I0313 12:09:02.468751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 12:09:03 crc kubenswrapper[4837]: I0313 12:09:03.511147 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:03 crc kubenswrapper[4837]: I0313 12:09:03.511403 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:04 crc kubenswrapper[4837]: I0313 12:09:04.524840 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:04 crc kubenswrapper[4837]: I0313 12:09:04.524885 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.631262 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.668869 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.827500 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.827562 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.944215 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:05 crc kubenswrapper[4837]: I0313 12:09:05.944402 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics" containerID="cri-o://07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402" gracePeriod=30 Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.223914 4837 generic.go:334] "Generic (PLEG): container finished" podID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerID="07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402" exitCode=2 Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.224916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerDied","Data":"07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402"} Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.273860 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.687923 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.842940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") pod \"a250849d-ca15-40fa-8b1d-a32b5abc6861\" (UID: \"a250849d-ca15-40fa-8b1d-a32b5abc6861\") " Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.850724 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6" (OuterVolumeSpecName: "kube-api-access-9vwh6") pod "a250849d-ca15-40fa-8b1d-a32b5abc6861" (UID: "a250849d-ca15-40fa-8b1d-a32b5abc6861"). InnerVolumeSpecName "kube-api-access-9vwh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.909842 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.909877 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:06 crc kubenswrapper[4837]: I0313 12:09:06.945493 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vwh6\" (UniqueName: \"kubernetes.io/projected/a250849d-ca15-40fa-8b1d-a32b5abc6861-kube-api-access-9vwh6\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.236656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a250849d-ca15-40fa-8b1d-a32b5abc6861","Type":"ContainerDied","Data":"7a22f32b80bf3ec02fab7028c9c981153ef89481c11b18583b8c1e3f0c67df24"} Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.236670 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.237032 4837 scope.go:117] "RemoveContainer" containerID="07fc1a83feb8d7932c2b80f34ffbd6218ef230bb996e92d9892feae57b23c402" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.277280 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.292521 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.303664 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:07 crc kubenswrapper[4837]: E0313 12:09:07.304167 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.304192 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.304411 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" containerName="kube-state-metrics" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.305177 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.307263 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.308087 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.313712 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455753 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455880 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.455899 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558537 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558588 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.558668 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.563671 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.564274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.565370 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.573147 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vvx\" (UniqueName: \"kubernetes.io/projected/abd69ff2-e72e-40c0-925f-d0c1c0a40f9a-kube-api-access-79vvx\") pod \"kube-state-metrics-0\" (UID: \"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a\") " pod="openstack/kube-state-metrics-0" Mar 13 12:09:07 crc kubenswrapper[4837]: I0313 12:09:07.624337 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.061403 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062006 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" containerID="cri-o://d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" gracePeriod=30 Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062139 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" containerID="cri-o://94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" gracePeriod=30 Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062186 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" containerID="cri-o://a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" gracePeriod=30 Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.062228 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" containerID="cri-o://6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" gracePeriod=30 Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.150844 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.261203 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" exitCode=2 Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.261555 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569"} Mar 13 12:09:08 crc kubenswrapper[4837]: I0313 12:09:08.262989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a","Type":"ContainerStarted","Data":"8afb9347869a58b1a54ddf30e6a1b29a5a1fcc55ece8e9ecf5f34ecb84524951"} Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.068907 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a250849d-ca15-40fa-8b1d-a32b5abc6861" path="/var/lib/kubelet/pods/a250849d-ca15-40fa-8b1d-a32b5abc6861/volumes" Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277190 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" exitCode=0 Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277237 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" exitCode=0 Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277275 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5"} Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.277339 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075"} Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.279977 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"abd69ff2-e72e-40c0-925f-d0c1c0a40f9a","Type":"ContainerStarted","Data":"f66785422b595e76fe8cdbc3485cda087523c65c31dd5c0f304d466dab6a34ce"} Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.280112 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 12:09:09 crc kubenswrapper[4837]: I0313 12:09:09.305054 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.933464064 podStartE2EDuration="2.305035574s" podCreationTimestamp="2026-03-13 12:09:07 +0000 UTC" firstStartedPulling="2026-03-13 12:09:08.150996528 +0000 UTC m=+1263.789263291" lastFinishedPulling="2026-03-13 12:09:08.522568018 +0000 UTC m=+1264.160834801" observedRunningTime="2026-03-13 12:09:09.303438884 +0000 UTC m=+1264.941705667" watchObservedRunningTime="2026-03-13 12:09:09.305035574 +0000 UTC m=+1264.943302337" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.316466 4837 generic.go:334] "Generic (PLEG): container finished" podID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerID="6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" exitCode=0 Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.316509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094"} Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318749 4837 generic.go:334] "Generic (PLEG): container finished" podID="81ec286a-b6df-4462-8023-c01230a50793" containerID="2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" exitCode=137 Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318789 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerDied","Data":"2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6"} Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318816 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"81ec286a-b6df-4462-8023-c01230a50793","Type":"ContainerDied","Data":"78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a"} Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.318830 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78f61644c1756b2a1acf80d548b16d064b0de263e518cd87a1b42cea8c63088a" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.406135 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.517140 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.517718 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.521134 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577365 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577511 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.577566 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") pod \"81ec286a-b6df-4462-8023-c01230a50793\" (UID: \"81ec286a-b6df-4462-8023-c01230a50793\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.584836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b" (OuterVolumeSpecName: "kube-api-access-srf5b") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: "81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "kube-api-access-srf5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.604718 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: "81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.607172 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data" (OuterVolumeSpecName: "config-data") pod "81ec286a-b6df-4462-8023-c01230a50793" (UID: "81ec286a-b6df-4462-8023-c01230a50793"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.628792 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.680540 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.681005 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srf5b\" (UniqueName: \"kubernetes.io/projected/81ec286a-b6df-4462-8023-c01230a50793-kube-api-access-srf5b\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.681020 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ec286a-b6df-4462-8023-c01230a50793-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.782879 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.782981 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783055 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783116 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783258 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783294 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") pod \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\" (UID: \"a7f70330-cb87-42e5-96c8-6d54828f2a5a\") " Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783802 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.783852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.784364 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.784385 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7f70330-cb87-42e5-96c8-6d54828f2a5a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.787290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd" (OuterVolumeSpecName: "kube-api-access-lzlqd") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "kube-api-access-lzlqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.788296 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts" (OuterVolumeSpecName: "scripts") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.812880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.857165 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.877176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data" (OuterVolumeSpecName: "config-data") pod "a7f70330-cb87-42e5-96c8-6d54828f2a5a" (UID: "a7f70330-cb87-42e5-96c8-6d54828f2a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886386 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzlqd\" (UniqueName: \"kubernetes.io/projected/a7f70330-cb87-42e5-96c8-6d54828f2a5a-kube-api-access-lzlqd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886419 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886428 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886437 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:13 crc kubenswrapper[4837]: I0313 12:09:13.886446 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7f70330-cb87-42e5-96c8-6d54828f2a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329360 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329367 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7f70330-cb87-42e5-96c8-6d54828f2a5a","Type":"ContainerDied","Data":"c1caae87e2bfbe9657e4b62036ebd200f6d2445955d6ab66a4adb47c94a2fae0"} Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.330596 4837 scope.go:117] "RemoveContainer" containerID="94da49d6a7255e5847d10069ee75dd614b4c6eea7e080a518814f780623556e5" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.329511 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.336972 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.367351 4837 scope.go:117] "RemoveContainer" containerID="a9f4ef9baf51c5a45fe25c828b539addde1c0065712a676f95056b2183f00569" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.390037 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.399806 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.411814 4837 scope.go:117] "RemoveContainer" containerID="6d24d7cecf025123d4d281213efc8079b0cb18a3f100808ee593959500d93094" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.426526 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427070 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427092 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427124 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427133 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427158 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427167 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427177 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427184 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: E0313 12:09:14.427200 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427210 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427442 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-central-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427462 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ec286a-b6df-4462-8023-c01230a50793" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 12:09:14 
crc kubenswrapper[4837]: I0313 12:09:14.427486 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="proxy-httpd" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427511 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="sg-core" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.427524 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" containerName="ceilometer-notification-agent" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.428297 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.441981 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.441981 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.442066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.442450 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.458401 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.465376 4837 scope.go:117] "RemoveContainer" containerID="d8b81a1d862c648975bd9a812fe1d61df727077dd39a97f4adfc70dac6066075" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.475502 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.489035 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.491726 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.494041 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.494569 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.496753 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.499225 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614172 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614463 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614743 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614895 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.614956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.615095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717083 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717434 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717494 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717518 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717587 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717611 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717649 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717677 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717707 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717732 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.717775 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.718969 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.719359 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721676 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721865 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.721946 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.723078 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.723513 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.727374 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.728343 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.737106 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/662e258d-fe94-4373-912d-c906f1e93c90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.741394 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"ceilometer-0\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " pod="openstack/ceilometer-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.749220 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmn4v\" (UniqueName: \"kubernetes.io/projected/662e258d-fe94-4373-912d-c906f1e93c90-kube-api-access-xmn4v\") pod \"nova-cell1-novncproxy-0\" (UID: \"662e258d-fe94-4373-912d-c906f1e93c90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.767734 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:14 crc kubenswrapper[4837]: I0313 12:09:14.807097 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.063757 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ec286a-b6df-4462-8023-c01230a50793" path="/var/lib/kubelet/pods/81ec286a-b6df-4462-8023-c01230a50793/volumes" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.064507 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f70330-cb87-42e5-96c8-6d54828f2a5a" path="/var/lib/kubelet/pods/a7f70330-cb87-42e5-96c8-6d54828f2a5a/volumes" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.260986 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.380655 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"662e258d-fe94-4373-912d-c906f1e93c90","Type":"ContainerStarted","Data":"e242c6f26aa5b6f1cc1df2ec9bee4085137c95b2dd848bcff17437ed302ab123"} Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.385049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.829751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830556 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.830624 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.833395 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:09:15 crc kubenswrapper[4837]: I0313 12:09:15.835542 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.044883 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.046788 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.063823 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144804 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144891 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.144966 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.145083 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.145114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: 
\"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247631 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.247790 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.248385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.251448 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252515 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.252705 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 
12:09:16.272953 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"dnsmasq-dns-89c5cd4d5-ql9zn\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.380787 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.394397 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"662e258d-fe94-4373-912d-c906f1e93c90","Type":"ContainerStarted","Data":"3ce5094b532dea20025f607d318aa78612e1278ac2854d7f4b06ea6f2e4d4746"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.402123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.402174 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"8371eb3af69417b51c9486141ddff58bfc7ec752522d7b166877385a6b1e772e"} Mar 13 12:09:16 crc kubenswrapper[4837]: I0313 12:09:16.410673 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.410656599 podStartE2EDuration="2.410656599s" podCreationTimestamp="2026-03-13 12:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:16.409917896 +0000 UTC m=+1272.048184659" watchObservedRunningTime="2026-03-13 12:09:16.410656599 +0000 UTC m=+1272.048923362" Mar 13 12:09:17 crc kubenswrapper[4837]: W0313 12:09:16.886731 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9c85e6_5c66_4c94_996b_0278453fd29c.slice/crio-b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7 WatchSource:0}: Error finding container b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7: Status 404 returned error can't find the container with id b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7 Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:16.896017 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412223 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerID="0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf" exitCode=0 Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412302 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.412614 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" 
event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerStarted","Data":"b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.416646 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} Mar 13 12:09:17 crc kubenswrapper[4837]: I0313 12:09:17.642677 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.431225 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerStarted","Data":"daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6"} Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.432361 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.437193 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.459584 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" podStartSLOduration=3.45955511 podStartE2EDuration="3.45955511s" podCreationTimestamp="2026-03-13 12:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:18.458385084 +0000 UTC m=+1274.096651847" watchObservedRunningTime="2026-03-13 12:09:18.45955511 +0000 UTC m=+1274.097821863" Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.612490 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.612946 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" containerID="cri-o://723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" gracePeriod=30 Mar 13 12:09:18 crc kubenswrapper[4837]: I0313 12:09:18.613116 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" containerID="cri-o://8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" gracePeriod=30 Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.146307 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.449379 4837 generic.go:334] "Generic (PLEG): container finished" podID="534b3e48-da2d-41b6-af02-bef43adcac21" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" exitCode=143 Mar 13 12:09:19 crc kubenswrapper[4837]: I0313 12:09:19.449448 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"} Mar 13 12:09:19 crc kubenswrapper[4837]: 
I0313 12:09:19.768902 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462282 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerStarted","Data":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462442 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" containerID="cri-o://770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462495 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462443 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" containerID="cri-o://0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462409 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" containerID="cri-o://26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.462530 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" containerID="cri-o://e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" gracePeriod=30 Mar 13 12:09:20 crc kubenswrapper[4837]: I0313 12:09:20.494848 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.23566101 podStartE2EDuration="6.494827362s" podCreationTimestamp="2026-03-13 12:09:14 +0000 UTC" firstStartedPulling="2026-03-13 12:09:15.39019404 +0000 UTC m=+1271.028460803" lastFinishedPulling="2026-03-13 12:09:19.649360392 +0000 UTC m=+1275.287627155" observedRunningTime="2026-03-13 12:09:20.481139031 +0000 UTC m=+1276.119405804" watchObservedRunningTime="2026-03-13 12:09:20.494827362 +0000 UTC m=+1276.133094125" Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477719 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" exitCode=0 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477808 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" exitCode=2 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477829 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" exitCode=0 Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477758 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477882 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} Mar 13 12:09:21 crc kubenswrapper[4837]: I0313 12:09:21.477904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.241958 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307560 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307654 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307760 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.307857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") pod \"534b3e48-da2d-41b6-af02-bef43adcac21\" (UID: \"534b3e48-da2d-41b6-af02-bef43adcac21\") " Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.308524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs" (OuterVolumeSpecName: "logs") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.310332 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/534b3e48-da2d-41b6-af02-bef43adcac21-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.315302 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9" (OuterVolumeSpecName: "kube-api-access-hxww9") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "kube-api-access-hxww9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.334348 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data" (OuterVolumeSpecName: "config-data") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.359231 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "534b3e48-da2d-41b6-af02-bef43adcac21" (UID: "534b3e48-da2d-41b6-af02-bef43adcac21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412243 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412288 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxww9\" (UniqueName: \"kubernetes.io/projected/534b3e48-da2d-41b6-af02-bef43adcac21-kube-api-access-hxww9\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.412301 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534b3e48-da2d-41b6-af02-bef43adcac21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507526 4837 generic.go:334] "Generic (PLEG): container finished" podID="534b3e48-da2d-41b6-af02-bef43adcac21" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" exitCode=0 Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507573 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507588 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507613 4837 scope.go:117] "RemoveContainer" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.507599 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"534b3e48-da2d-41b6-af02-bef43adcac21","Type":"ContainerDied","Data":"bcd98412fdb19b2343b2d9cf6ae91d9ffecdeb486a55232ac24c6fe9606a8706"} Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.545952 4837 scope.go:117] "RemoveContainer" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.555146 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.569010 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.576909 4837 scope.go:117] "RemoveContainer" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.579631 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580099 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580121 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580143 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580155 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580414 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-api" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580435 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" containerName="nova-api-log" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.580803 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": container with ID starting with 8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6 not found: ID does not exist" containerID="8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580855 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6"} err="failed to get container status \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": rpc error: code = NotFound desc = could not find container \"8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6\": container with ID starting with 8dadb969173ec34b078eb02203c6c8d0426368edb3963a41943b966c8fe59ae6 not found: ID does not 
exist" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.580887 4837 scope.go:117] "RemoveContainer" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.581678 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: E0313 12:09:22.582836 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": container with ID starting with 723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a not found: ID does not exist" containerID="723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.582872 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a"} err="failed to get container status \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": rpc error: code = NotFound desc = could not find container \"723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a\": container with ID starting with 723ce6d8aa73da7a7f75ecb7c83fcc098afc292834683b8ef574eab17dadd84a not found: ID does not exist" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.595236 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.623903 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.624521 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.624843 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726114 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726265 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc 
kubenswrapper[4837]: I0313 12:09:22.726429 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.726496 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828738 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828896 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.828988 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829043 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829150 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829258 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.829687 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.835522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc 
kubenswrapper[4837]: I0313 12:09:22.835611 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.836082 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.837057 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.846112 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"nova-api-0\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " pod="openstack/nova-api-0" Mar 13 12:09:22 crc kubenswrapper[4837]: I0313 12:09:22.962356 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.062905 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534b3e48-da2d-41b6-af02-bef43adcac21" path="/var/lib/kubelet/pods/534b3e48-da2d-41b6-af02-bef43adcac21/volumes" Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.439681 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:23 crc kubenswrapper[4837]: I0313 12:09:23.521252 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"196c93139204cac88ac74bf775fe6446d52e56b24f2532d6b6a8393e6ffd7da4"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.532628 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.532993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerStarted","Data":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.552869 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.552847102 podStartE2EDuration="2.552847102s" podCreationTimestamp="2026-03-13 12:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:24.54773584 +0000 UTC m=+1280.186002603" watchObservedRunningTime="2026-03-13 12:09:24.552847102 +0000 UTC m=+1280.191113875" Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.768525 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" 
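The pod_startup_latency_tracker entries above (for nova-cell1-novncproxy-0, dnsmasq-dns-89c5cd4d5-ql9zn, ceilometer-0 and nova-api-0) each carry podStartSLOduration and podStartE2EDuration fields. A minimal sketch for pulling those figures out of a saved capture like this one, assuming Python 3 and a journal dump saved to a text file (the kubelet.log filename is illustrative, not taken from this log):

import re
import sys

# Matches the pod_startup_latency_tracker entries recorded above, e.g.
#   "Observed pod startup duration" pod="openstack/nova-api-0"
#   podStartSLOduration=2.552847102 podStartE2EDuration="2.552847102s" ...
PATTERN = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" '
    r'podStartSLOduration=(?P<slo>[0-9.]+) '
    r'podStartE2EDuration="(?P<e2e>[^"]+)"'
)

def report(path):
    # Scan a saved journal capture and print one line per startup observation.
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            # finditer, because this capture packs several journal entries per physical line.
            for m in PATTERN.finditer(line):
                print(f'{m.group("pod")}: SLO={m.group("slo")}s end-to-end={m.group("e2e")}')

if __name__ == "__main__":
    report(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")

Run it with the path to the saved journal text; each match prints the pod name followed by its SLO and end-to-end startup durations as they appear in the kubelet's own log fields.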
Mar 13 12:09:24 crc kubenswrapper[4837]: I0313 12:09:24.788664 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.350603 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.490522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.490804 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491151 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491354 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491799 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.491940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492060 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492525 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492690 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") pod \"87f0825e-ff58-4bf4-bf83-6522dcc333e2\" (UID: 
\"87f0825e-ff58-4bf4-bf83-6522dcc333e2\") " Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.492461 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.494098 4837 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.494189 4837 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87f0825e-ff58-4bf4-bf83-6522dcc333e2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.505712 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w" (OuterVolumeSpecName: "kube-api-access-bzw9w") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "kube-api-access-bzw9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.506908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts" (OuterVolumeSpecName: "scripts") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.529858 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.545945 4837 generic.go:334] "Generic (PLEG): container finished" podID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" exitCode=0 Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.545993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546039 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87f0825e-ff58-4bf4-bf83-6522dcc333e2","Type":"ContainerDied","Data":"8371eb3af69417b51c9486141ddff58bfc7ec752522d7b166877385a6b1e772e"} Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546060 4837 scope.go:117] "RemoveContainer" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.546095 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.553586 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.564502 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606802 4837 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606839 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606850 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzw9w\" (UniqueName: \"kubernetes.io/projected/87f0825e-ff58-4bf4-bf83-6522dcc333e2-kube-api-access-bzw9w\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606859 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.606928 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.624727 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data" (OuterVolumeSpecName: "config-data") pod "87f0825e-ff58-4bf4-bf83-6522dcc333e2" (UID: "87f0825e-ff58-4bf4-bf83-6522dcc333e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.662762 4837 scope.go:117] "RemoveContainer" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.684327 4837 scope.go:117] "RemoveContainer" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.708679 4837 scope.go:117] "RemoveContainer" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.710229 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.710279 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f0825e-ff58-4bf4-bf83-6522dcc333e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.731980 4837 scope.go:117] "RemoveContainer" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.733906 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": container with ID starting with 770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8 not found: ID does not exist" containerID="770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.733942 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8"} err="failed to get container status \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": rpc error: code = NotFound desc = could not find container \"770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8\": container with ID starting with 770ef0a10fb37e3814cf493f7e62eb7e15ce55983a506bec280081e698fad5c8 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.733965 4837 scope.go:117] "RemoveContainer" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.734437 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": container with ID starting with 0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1 not found: ID does not exist" containerID="0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.734475 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1"} err="failed to get container status \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": rpc error: code = NotFound desc = could not find container \"0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1\": container with ID starting with 
0ad1a68e05e02f84e1d06c1b173ebb7a467268f5b60941fa800e2751955e3df1 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.734494 4837 scope.go:117] "RemoveContainer" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.735169 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": container with ID starting with e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72 not found: ID does not exist" containerID="e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735240 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72"} err="failed to get container status \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": rpc error: code = NotFound desc = could not find container \"e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72\": container with ID starting with e7e5f4559015b914e67c3cf32dcbe3ce14a32ed203cfc64d9b54f0456af10b72 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735285 4837 scope.go:117] "RemoveContainer" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.735692 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": container with ID starting with 26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88 not found: ID does not exist" containerID="26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.735720 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88"} err="failed to get container status \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": rpc error: code = NotFound desc = could not find container \"26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88\": container with ID starting with 26f413af49f804fe13ff8c9b2b887ef2e357277501d29cf746c057c4c5c85b88 not found: ID does not exist" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.826384 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827133 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827157 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827284 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827299 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc 
kubenswrapper[4837]: E0313 12:09:25.827318 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827327 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: E0313 12:09:25.827342 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827350 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827673 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-notification-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827703 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="proxy-httpd" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.827722 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="ceilometer-central-agent" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.828950 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" containerName="sg-core" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.830403 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.833120 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.833361 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.857113 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915216 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.915594 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: 
I0313 12:09:25.915669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.949564 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.958745 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.986941 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.990339 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.994094 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 12:09:25 crc kubenswrapper[4837]: I0313 12:09:25.995105 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:25.999987 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.013727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.017882 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018015 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018174 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.018271 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.025773 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " 
pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.026725 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.029816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.043036 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"nova-cell1-cell-mapping-mzmd5\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.119988 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120068 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120129 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120173 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwkd\" (UniqueName: \"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120375 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120408 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.120450 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.173161 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222803 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222920 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222947 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.222979 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223038 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223072 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwkd\" (UniqueName: \"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.223815 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-log-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.224081 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b5b509-a674-4a89-a7cc-c01c7bfca144-run-httpd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.232483 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.232999 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-scripts\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.233124 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.234090 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-config-data\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.234334 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b5b509-a674-4a89-a7cc-c01c7bfca144-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.247139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwkd\" (UniqueName: \"kubernetes.io/projected/82b5b509-a674-4a89-a7cc-c01c7bfca144-kube-api-access-wdwkd\") pod \"ceilometer-0\" (UID: \"82b5b509-a674-4a89-a7cc-c01c7bfca144\") " pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.311723 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.384007 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.456334 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.457712 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" containerID="cri-o://1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" gracePeriod=10 Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.738019 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:09:26 crc kubenswrapper[4837]: W0313 12:09:26.752277 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f45aae_caa3_4c50_9059_be42d328cba1.slice/crio-169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692 WatchSource:0}: Error finding container 169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692: Status 404 returned error can't find the container with id 169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692 Mar 13 12:09:26 crc kubenswrapper[4837]: I0313 12:09:26.969096 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.108788 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f0825e-ff58-4bf4-bf83-6522dcc333e2" path="/var/lib/kubelet/pods/87f0825e-ff58-4bf4-bf83-6522dcc333e2/volumes" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.150168 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252185 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252422 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252678 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252759 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.252896 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") pod \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\" (UID: \"6de330b6-0bbb-4a9d-9062-9c7ed182a189\") " Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.279766 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg" (OuterVolumeSpecName: "kube-api-access-7gvxg") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "kube-api-access-7gvxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.322334 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.332176 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.337713 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.343365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.343433 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config" (OuterVolumeSpecName: "config") pod "6de330b6-0bbb-4a9d-9062-9c7ed182a189" (UID: "6de330b6-0bbb-4a9d-9062-9c7ed182a189"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355759 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355799 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355811 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355826 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355838 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de330b6-0bbb-4a9d-9062-9c7ed182a189-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.355849 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gvxg\" (UniqueName: \"kubernetes.io/projected/6de330b6-0bbb-4a9d-9062-9c7ed182a189-kube-api-access-7gvxg\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572628 4837 generic.go:334] "Generic (PLEG): container finished" podID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" exitCode=0 Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572727 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} Mar 13 12:09:27 crc 
kubenswrapper[4837]: I0313 12:09:27.572761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" event={"ID":"6de330b6-0bbb-4a9d-9062-9c7ed182a189","Type":"ContainerDied","Data":"d820b1edec0c5d2d420936bab95dffbf9bd4c7adef7db33d312ced4b311526ff"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572768 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-5blpv" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.572781 4837 scope.go:117] "RemoveContainer" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.575341 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerStarted","Data":"e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.576205 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerStarted","Data":"169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.583172 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"7d1984717c670e6c61e1d51e4ec9299f3dd2cfc5f7b44bf67bbe7d72918d5c6e"} Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.593685 4837 scope.go:117] "RemoveContainer" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.621456 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mzmd5" podStartSLOduration=2.621433569 podStartE2EDuration="2.621433569s" podCreationTimestamp="2026-03-13 12:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:27.606471077 +0000 UTC m=+1283.244737850" watchObservedRunningTime="2026-03-13 12:09:27.621433569 +0000 UTC m=+1283.259700342" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.636117 4837 scope.go:117] "RemoveContainer" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: E0313 12:09:27.639832 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": container with ID starting with 1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc not found: ID does not exist" containerID="1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.639920 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc"} err="failed to get container status \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": rpc error: code = NotFound desc = could not find container \"1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc\": container with ID starting with 1d8786b6d9674dc9d5eaebc032e5dbd8c1d018dc4a94605d311592e57b3895fc not 
found: ID does not exist" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.640009 4837 scope.go:117] "RemoveContainer" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: E0313 12:09:27.640772 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": container with ID starting with 7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9 not found: ID does not exist" containerID="7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.640840 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9"} err="failed to get container status \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": rpc error: code = NotFound desc = could not find container \"7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9\": container with ID starting with 7468313c118293c73f68950b41a915eb07c6510dd5985dea1ec55106483d1ae9 not found: ID does not exist" Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.652658 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:27 crc kubenswrapper[4837]: I0313 12:09:27.663595 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-5blpv"] Mar 13 12:09:28 crc kubenswrapper[4837]: I0313 12:09:28.596674 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"bbfcf746609a946bc3db7fc627e745ae9e9768a0d134e4e98791513fbddaa72d"} Mar 13 12:09:28 crc kubenswrapper[4837]: I0313 12:09:28.597052 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"045eef85cb2e6b2e1ec2f3cbd2e8715b1219b97a406f5894bd80115a5a961db4"} Mar 13 12:09:29 crc kubenswrapper[4837]: I0313 12:09:29.063298 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" path="/var/lib/kubelet/pods/6de330b6-0bbb-4a9d-9062-9c7ed182a189/volumes" Mar 13 12:09:29 crc kubenswrapper[4837]: I0313 12:09:29.611073 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"9dcde2302697f768971d264413f868731fa4ac41393b876ece3a51c664182c72"} Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 12:09:31.630663 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b5b509-a674-4a89-a7cc-c01c7bfca144","Type":"ContainerStarted","Data":"2e4fc926acf9cb684f3794b18592198042298af8a9d1176684449adb0020a337"} Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 12:09:31.631185 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 12:09:31 crc kubenswrapper[4837]: I0313 12:09:31.659963 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.424966887 podStartE2EDuration="6.659937463s" podCreationTimestamp="2026-03-13 12:09:25 +0000 UTC" firstStartedPulling="2026-03-13 12:09:27.059493605 +0000 UTC 
m=+1282.697760378" lastFinishedPulling="2026-03-13 12:09:30.294464191 +0000 UTC m=+1285.932730954" observedRunningTime="2026-03-13 12:09:31.651287921 +0000 UTC m=+1287.289554694" watchObservedRunningTime="2026-03-13 12:09:31.659937463 +0000 UTC m=+1287.298204226" Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.640884 4837 generic.go:334] "Generic (PLEG): container finished" podID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerID="e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56" exitCode=0 Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.640953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerDied","Data":"e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56"} Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.963394 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:32 crc kubenswrapper[4837]: I0313 12:09:32.963447 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:33 crc kubenswrapper[4837]: I0313 12:09:33.986020 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:33 crc kubenswrapper[4837]: I0313 12:09:33.986037 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.102774 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.202876 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203022 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203076 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.203224 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") pod \"f0f45aae-caa3-4c50-9059-be42d328cba1\" (UID: \"f0f45aae-caa3-4c50-9059-be42d328cba1\") " Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.223231 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c" (OuterVolumeSpecName: "kube-api-access-ctw8c") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "kube-api-access-ctw8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.224385 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts" (OuterVolumeSpecName: "scripts") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.237818 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.243873 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data" (OuterVolumeSpecName: "config-data") pod "f0f45aae-caa3-4c50-9059-be42d328cba1" (UID: "f0f45aae-caa3-4c50-9059-be42d328cba1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308002 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308290 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308359 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctw8c\" (UniqueName: \"kubernetes.io/projected/f0f45aae-caa3-4c50-9059-be42d328cba1-kube-api-access-ctw8c\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.308417 4837 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f45aae-caa3-4c50-9059-be42d328cba1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664472 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzmd5" event={"ID":"f0f45aae-caa3-4c50-9059-be42d328cba1","Type":"ContainerDied","Data":"169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692"} Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664733 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169405adf02892a1cf5eb78b7101a3517b5c16d690364e5a55dd374130c20692" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.664884 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzmd5" Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.860832 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.861093 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" containerID="cri-o://84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.883346 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.883667 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" containerID="cri-o://7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.884154 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" containerID="cri-o://4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.899354 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.899859 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" 
containerName="nova-metadata-log" containerID="cri-o://9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" gracePeriod=30 Mar 13 12:09:34 crc kubenswrapper[4837]: I0313 12:09:34.900077 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" containerID="cri-o://456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" gracePeriod=30 Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.633166 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.636116 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.637410 4837 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 12:09:35 crc kubenswrapper[4837]: E0313 12:09:35.637451 4837 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.674866 4837 generic.go:334] "Generic (PLEG): container finished" podID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" exitCode=143 Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.674951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.676980 4837 generic.go:334] "Generic (PLEG): container finished" podID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" exitCode=143 Mar 13 12:09:35 crc kubenswrapper[4837]: I0313 12:09:35.677038 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.505726 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.589883 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590053 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590117 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.590816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.591177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") pod \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\" (UID: \"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43\") " Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.591880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs" (OuterVolumeSpecName: "logs") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.598267 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9" (OuterVolumeSpecName: "kube-api-access-7skk9") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "kube-api-access-7skk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.619946 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data" (OuterVolumeSpecName: "config-data") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.621882 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.640090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" (UID: "5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.692987 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693022 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7skk9\" (UniqueName: \"kubernetes.io/projected/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-kube-api-access-7skk9\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693033 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693044 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.693053 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704281 4837 generic.go:334] "Generic (PLEG): container finished" podID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" exitCode=0 Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704335 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704368 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43","Type":"ContainerDied","Data":"c2ff21ee05eb4c0cd65e5feb281a54f68d478fb493c97488ec3ad06bbc0f4880"} Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704391 4837 scope.go:117] "RemoveContainer" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.704531 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.739427 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.740718 4837 scope.go:117] "RemoveContainer" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.751112 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759306 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759790 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759805 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759828 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759834 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759844 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="init" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759850 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="init" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759863 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759869 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.759897 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.759903 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760070 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-metadata" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760080 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" containerName="nova-metadata-log" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760088 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" containerName="nova-manage" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.760111 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de330b6-0bbb-4a9d-9062-9c7ed182a189" containerName="dnsmasq-dns" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.761125 4837 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.768571 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.781980 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.790706 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.791272 4837 scope.go:117] "RemoveContainer" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.800706 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": container with ID starting with 456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e not found: ID does not exist" containerID="456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.800866 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e"} err="failed to get container status \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": rpc error: code = NotFound desc = could not find container \"456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e\": container with ID starting with 456c80f0855c2245bf0a0fc6d9cc652dad19c0773d477ea76ccfc7415d5a5c8e not found: ID does not exist" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.800901 4837 scope.go:117] "RemoveContainer" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: E0313 12:09:38.801505 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": container with ID starting with 9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98 not found: ID does not exist" containerID="9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.801621 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98"} err="failed to get container status \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": rpc error: code = NotFound desc = could not find container \"9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98\": container with ID starting with 9b97f0741ed8dc4568de5acf76058ec03e048925850e843f27b09f36ed5bcf98 not found: ID does not exist" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.914814 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.914909 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.915018 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.915072 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:38 crc kubenswrapper[4837]: I0313 12:09:38.915136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016756 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016843 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016907 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.016952 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.017028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.017441 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faa5418-aa48-4e20-830c-bb171cfea0d9-logs\") pod \"nova-metadata-0\" (UID: 
\"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.020680 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.020731 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-config-data\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.021311 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faa5418-aa48-4e20-830c-bb171cfea0d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.040325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwv5\" (UniqueName: \"kubernetes.io/projected/7faa5418-aa48-4e20-830c-bb171cfea0d9-kube-api-access-9rwv5\") pod \"nova-metadata-0\" (UID: \"7faa5418-aa48-4e20-830c-bb171cfea0d9\") " pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.063047 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43" path="/var/lib/kubelet/pods/5000e5ff-8cf6-4f0c-a6c4-e6b550c2fe43/volumes" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.090768 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 12:09:39 crc kubenswrapper[4837]: W0313 12:09:39.548023 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7faa5418_aa48_4e20_830c_bb171cfea0d9.slice/crio-b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174 WatchSource:0}: Error finding container b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174: Status 404 returned error can't find the container with id b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.559224 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.671393 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.719007 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742717 4837 generic.go:334] "Generic (PLEG): container finished" podID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742820 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"78c9774b-e6ed-434a-9a05-77de64d14c5c","Type":"ContainerDied","Data":"196c93139204cac88ac74bf775fe6446d52e56b24f2532d6b6a8393e6ffd7da4"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742836 4837 scope.go:117] "RemoveContainer" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.742832 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.743939 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.744135 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.744231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") pod \"4cc7473d-2608-4989-990f-a19d70e8a3a3\" (UID: \"4cc7473d-2608-4989-990f-a19d70e8a3a3\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.749478 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"b1cb476eb0c804299feb94db952778382ba43003eedec8e33c2f2553f024b174"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755137 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx" (OuterVolumeSpecName: "kube-api-access-7xqdx") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "kube-api-access-7xqdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755533 4837 generic.go:334] "Generic (PLEG): container finished" podID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" exitCode=0 Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755572 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerDied","Data":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755583 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.755598 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4cc7473d-2608-4989-990f-a19d70e8a3a3","Type":"ContainerDied","Data":"7499f46006750dbda5f6e6ebf4a383aa5e9cd8afe4efcdd2e54c6f88540d0454"} Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.770906 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.773537 4837 scope.go:117] "RemoveContainer" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.778316 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data" (OuterVolumeSpecName: "config-data") pod "4cc7473d-2608-4989-990f-a19d70e8a3a3" (UID: "4cc7473d-2608-4989-990f-a19d70e8a3a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.791965 4837 scope.go:117] "RemoveContainer" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.792746 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": container with ID starting with 4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a not found: ID does not exist" containerID="4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.792791 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a"} err="failed to get container status \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": rpc error: code = NotFound desc = could not find container \"4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a\": container with ID starting with 4ca07acd2a41f0c3f14868adbd5c92b4f4e135a23bfc2d9caf191ce9bdb4d94a not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.792824 4837 scope.go:117] "RemoveContainer" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.793222 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": container with ID starting with 7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6 not found: ID does not exist" containerID="7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.793434 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6"} err="failed to get container status \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": rpc error: code = NotFound desc = could not find container \"7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6\": container with ID starting with 7b6a520617c17e8a5aea39cc8ad6ac82a12e5d413248ffe3dd096388d1de71a6 not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.793584 4837 scope.go:117] "RemoveContainer" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.821030 4837 scope.go:117] "RemoveContainer" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc kubenswrapper[4837]: E0313 12:09:39.821520 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": container with ID starting with 84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7 not found: ID does not exist" containerID="84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.821653 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7"} err="failed to get container status \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": rpc error: code = NotFound desc = could not find container \"84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7\": container with ID starting with 84217929c8dd01d9f27889d14b0e8c6e8e14465fc1547bcea5b66260cee7a8c7 not found: ID does not exist" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846259 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846331 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846536 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846693 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.846908 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs" (OuterVolumeSpecName: "logs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") pod \"78c9774b-e6ed-434a-9a05-77de64d14c5c\" (UID: \"78c9774b-e6ed-434a-9a05-77de64d14c5c\") " Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847549 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847630 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xqdx\" (UniqueName: \"kubernetes.io/projected/4cc7473d-2608-4989-990f-a19d70e8a3a3-kube-api-access-7xqdx\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.847764 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc7473d-2608-4989-990f-a19d70e8a3a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.851062 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk" (OuterVolumeSpecName: "kube-api-access-52nqk") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "kube-api-access-52nqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.881481 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data" (OuterVolumeSpecName: "config-data") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.886473 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.909920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.915835 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "78c9774b-e6ed-434a-9a05-77de64d14c5c" (UID: "78c9774b-e6ed-434a-9a05-77de64d14c5c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949788 4837 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949864 4837 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c9774b-e6ed-434a-9a05-77de64d14c5c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949877 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949918 4837 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949931 4837 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c9774b-e6ed-434a-9a05-77de64d14c5c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:39 crc kubenswrapper[4837]: I0313 12:09:39.949942 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52nqk\" (UniqueName: \"kubernetes.io/projected/78c9774b-e6ed-434a-9a05-77de64d14c5c-kube-api-access-52nqk\") on node \"crc\" DevicePath \"\"" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.078523 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.087390 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.100574 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.109384 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.124449 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.124943 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.124967 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.124992 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125001 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: E0313 12:09:40.125019 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125028 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125243 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" containerName="nova-scheduler-scheduler" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125269 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-api" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.125285 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" containerName="nova-api-log" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.126443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140499 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140773 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.140919 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.141814 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.144817 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.145900 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.149761 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.170328 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.253974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254257 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254319 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254350 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs59c\" (UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254775 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6pv7\" (UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.254950 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356228 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs59c\" (UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356323 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356396 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6pv7\" (UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356442 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356473 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356499 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356551 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356596 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.356652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.357169 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e6cd1d9-f670-4e94-8322-44e471c3be71-logs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.361325 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-public-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.361527 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362003 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362558 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-config-data\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.362657 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4e6cd1d9-f670-4e94-8322-44e471c3be71-config-data\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.363026 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380e047-7297-4835-b948-6c86c6b6aa27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.378432 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs59c\" (UniqueName: \"kubernetes.io/projected/4e6cd1d9-f670-4e94-8322-44e471c3be71-kube-api-access-gs59c\") pod \"nova-api-0\" (UID: \"4e6cd1d9-f670-4e94-8322-44e471c3be71\") " pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.384891 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6pv7\" (UniqueName: \"kubernetes.io/projected/d380e047-7297-4835-b948-6c86c6b6aa27-kube-api-access-t6pv7\") pod \"nova-scheduler-0\" (UID: \"d380e047-7297-4835-b948-6c86c6b6aa27\") " pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.518334 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.527046 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.769982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"47b96a2a9e2d4fd021ca3db7be839e86e74c808de1dd61390ea0329c3aa36dbb"} Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.770190 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faa5418-aa48-4e20-830c-bb171cfea0d9","Type":"ContainerStarted","Data":"f8feaf729480571f3bbdf2223f87e36947b32528c83503013718e8240ce9b19e"} Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.797786 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.797767274 podStartE2EDuration="2.797767274s" podCreationTimestamp="2026-03-13 12:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:40.789318558 +0000 UTC m=+1296.427585311" watchObservedRunningTime="2026-03-13 12:09:40.797767274 +0000 UTC m=+1296.436034037" Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.962591 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 12:09:40 crc kubenswrapper[4837]: I0313 12:09:40.976359 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.060773 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc7473d-2608-4989-990f-a19d70e8a3a3" path="/var/lib/kubelet/pods/4cc7473d-2608-4989-990f-a19d70e8a3a3/volumes" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.061967 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c9774b-e6ed-434a-9a05-77de64d14c5c" 
path="/var/lib/kubelet/pods/78c9774b-e6ed-434a-9a05-77de64d14c5c/volumes" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.780151 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d380e047-7297-4835-b948-6c86c6b6aa27","Type":"ContainerStarted","Data":"1ab5e7d45319b1507059f32e31610ecdcd6883c277c5700b64696c664cfd5b58"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.780512 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d380e047-7297-4835-b948-6c86c6b6aa27","Type":"ContainerStarted","Data":"275a8f630bfdc653a0077fb2b60275b114cfeb9c7c779f255ab8db2f4c7baf1d"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.782988 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"cc7a15dd902b97c758cb1f742fe8affed99ef0bc2035dc9fea8d357f72b9a616"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.783069 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"53969cca8ee0ed37fdb444b1b9c5bf33145fca56378ff3e37a209bebb610a563"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.783089 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4e6cd1d9-f670-4e94-8322-44e471c3be71","Type":"ContainerStarted","Data":"302f8203084fe3c1a235a2a972eeb6c0b3fc658aafcc9322a703a34aebd27b45"} Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.808011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.807989861 podStartE2EDuration="1.807989861s" podCreationTimestamp="2026-03-13 12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:41.801087634 +0000 UTC m=+1297.439354437" watchObservedRunningTime="2026-03-13 12:09:41.807989861 +0000 UTC m=+1297.446256644" Mar 13 12:09:41 crc kubenswrapper[4837]: I0313 12:09:41.839652 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.839603817 podStartE2EDuration="1.839603817s" podCreationTimestamp="2026-03-13 12:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:09:41.835342152 +0000 UTC m=+1297.473608965" watchObservedRunningTime="2026-03-13 12:09:41.839603817 +0000 UTC m=+1297.477870590" Mar 13 12:09:44 crc kubenswrapper[4837]: I0313 12:09:44.091910 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:09:44 crc kubenswrapper[4837]: I0313 12:09:44.092186 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 12:09:45 crc kubenswrapper[4837]: I0313 12:09:45.527679 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 12:09:49 crc kubenswrapper[4837]: I0313 12:09:49.091417 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:49 crc kubenswrapper[4837]: I0313 12:09:49.091756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 12:09:50 crc 
kubenswrapper[4837]: I0313 12:09:50.098903 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faa5418-aa48-4e20-830c-bb171cfea0d9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.104846 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faa5418-aa48-4e20-830c-bb171cfea0d9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.519341 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.519450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.527871 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.557808 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 12:09:50 crc kubenswrapper[4837]: I0313 12:09:50.912650 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 12:09:51 crc kubenswrapper[4837]: I0313 12:09:51.536857 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e6cd1d9-f670-4e94-8322-44e471c3be71" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:51 crc kubenswrapper[4837]: I0313 12:09:51.537250 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4e6cd1d9-f670-4e94-8322-44e471c3be71" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 12:09:56 crc kubenswrapper[4837]: I0313 12:09:56.323054 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.097134 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.098958 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.103472 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:09:59 crc kubenswrapper[4837]: I0313 12:09:59.990409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.152154 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.153758 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.155781 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.156087 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.156096 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.164379 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.243776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.345709 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.365918 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"auto-csr-approver-29556730-jvprz\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.482874 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.536045 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.536815 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.538398 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.581344 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.937333 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:10:00 crc kubenswrapper[4837]: W0313 12:10:00.942948 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod348878ea_aa9f_4306_af10_6a56583447a4.slice/crio-4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2 WatchSource:0}: Error finding container 4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2: Status 404 returned error can't find the container with id 4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2 Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.991195 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerStarted","Data":"4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2"} Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.991488 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 12:10:00 crc kubenswrapper[4837]: I0313 12:10:00.997436 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 12:10:03 crc kubenswrapper[4837]: I0313 12:10:03.009474 4837 generic.go:334] "Generic (PLEG): container finished" podID="348878ea-aa9f-4306-af10-6a56583447a4" containerID="400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39" exitCode=0 Mar 13 12:10:03 crc kubenswrapper[4837]: I0313 12:10:03.009533 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerDied","Data":"400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39"} Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.432580 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.522492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") pod \"348878ea-aa9f-4306-af10-6a56583447a4\" (UID: \"348878ea-aa9f-4306-af10-6a56583447a4\") " Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.528925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n" (OuterVolumeSpecName: "kube-api-access-z2j9n") pod "348878ea-aa9f-4306-af10-6a56583447a4" (UID: "348878ea-aa9f-4306-af10-6a56583447a4"). InnerVolumeSpecName "kube-api-access-z2j9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:04 crc kubenswrapper[4837]: I0313 12:10:04.625836 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2j9n\" (UniqueName: \"kubernetes.io/projected/348878ea-aa9f-4306-af10-6a56583447a4-kube-api-access-z2j9n\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029845 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556730-jvprz" event={"ID":"348878ea-aa9f-4306-af10-6a56583447a4","Type":"ContainerDied","Data":"4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2"} Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029894 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6246100aafc9dcc92b0edfac8408d1bba30aa0d3382d3be1cbdf84969b73b2" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.029894 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556730-jvprz" Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.496457 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:10:05 crc kubenswrapper[4837]: I0313 12:10:05.505828 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556724-st6gn"] Mar 13 12:10:07 crc kubenswrapper[4837]: I0313 12:10:07.060822 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bda3181-d107-4de8-b754-e5e67dd8dd9c" path="/var/lib/kubelet/pods/8bda3181-d107-4de8-b754-e5e67dd8dd9c/volumes" Mar 13 12:10:08 crc kubenswrapper[4837]: I0313 12:10:08.815486 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:09 crc kubenswrapper[4837]: I0313 12:10:09.517452 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.268823 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" containerID="cri-o://616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" gracePeriod=604796 Mar 13 12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.444344 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" containerID="cri-o://0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" gracePeriod=604797 Mar 13 12:10:13 crc kubenswrapper[4837]: I0313 12:10:13.454712 4837 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 13 12:10:19 crc kubenswrapper[4837]: I0313 12:10:19.872027 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:19 crc kubenswrapper[4837]: I0313 12:10:19.983684 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037477 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037605 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037694 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037810 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037877 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.037978 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038009 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038095 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038142 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: 
\"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.038175 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") pod \"e7b01be4-73b6-48eb-a06d-4fb38863d982\" (UID: \"e7b01be4-73b6-48eb-a06d-4fb38863d982\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.044838 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045482 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045598 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.045854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.046057 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info" (OuterVolumeSpecName: "pod-info") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.051022 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.053391 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd" (OuterVolumeSpecName: "kube-api-access-pl7pd") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "kube-api-access-pl7pd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.055929 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.102255 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf" (OuterVolumeSpecName: "server-conf") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.112212 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data" (OuterVolumeSpecName: "config-data") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140410 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140474 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.140914 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141156 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141183 4837 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141215 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141244 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141300 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141319 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") pod \"13254c8b-516c-435e-9db2-a8d518434f29\" (UID: \"13254c8b-516c-435e-9db2-a8d518434f29\") " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.141855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.142630 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145001 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145035 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145166 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145185 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145199 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145210 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145227 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e7b01be4-73b6-48eb-a06d-4fb38863d982-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145256 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145278 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145294 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7pd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-kube-api-access-pl7pd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145305 4837 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e7b01be4-73b6-48eb-a06d-4fb38863d982-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.145317 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e7b01be4-73b6-48eb-a06d-4fb38863d982-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.146921 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.147443 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.148526 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.153329 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e7b01be4-73b6-48eb-a06d-4fb38863d982" (UID: "e7b01be4-73b6-48eb-a06d-4fb38863d982"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181403 4837 generic.go:334] "Generic (PLEG): container finished" podID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" exitCode=0 Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181517 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e7b01be4-73b6-48eb-a06d-4fb38863d982","Type":"ContainerDied","Data":"c8180a84e0af5653dde0ac3c7b4b0a9aa55749048023693725605b5733ff15c7"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181534 4837 scope.go:117] "RemoveContainer" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.181701 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190136 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190091 4837 generic.go:334] "Generic (PLEG): container finished" podID="13254c8b-516c-435e-9db2-a8d518434f29" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" exitCode=0 Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"13254c8b-516c-435e-9db2-a8d518434f29","Type":"ContainerDied","Data":"5d7d6eb76793e1d7753bbea8ba4648a937e2549b33bfd032cf29e8f6d1e62f4c"} Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.190346 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.198150 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info" (OuterVolumeSpecName: "pod-info") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.200775 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87" (OuterVolumeSpecName: "kube-api-access-wfz87") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "kube-api-access-wfz87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.213555 4837 scope.go:117] "RemoveContainer" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.224474 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data" (OuterVolumeSpecName: "config-data") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.238095 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250500 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e7b01be4-73b6-48eb-a06d-4fb38863d982-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250554 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250565 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfz87\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-kube-api-access-wfz87\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250575 4837 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13254c8b-516c-435e-9db2-a8d518434f29-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250584 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250592 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250601 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250608 4837 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13254c8b-516c-435e-9db2-a8d518434f29-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.250616 4837 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.256761 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.267317 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf" (OuterVolumeSpecName: "server-conf") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.267722 4837 scope.go:117] "RemoveContainer" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.273034 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": container with ID starting with 616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c not found: ID does not exist" containerID="616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273084 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c"} err="failed to get container status \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": rpc error: code = NotFound desc = could not find container \"616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c\": container with ID starting with 616ab6849fdbe4a471544990fac8e7c0dc2c1e3c72338a214f97078a3b1bb01c not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273109 4837 scope.go:117] "RemoveContainer" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.273576 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": container with ID starting with afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007 not found: ID does not exist" containerID="afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273719 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007"} err="failed to get container status \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": rpc error: code = NotFound desc = could not find container \"afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007\": container with ID starting with afe3a88a0e8205fefe122a8099e4acc29a3ebc22c1a9a9cfe3c00f5ab1794007 not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.273822 4837 scope.go:117] "RemoveContainer" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.296791 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.303969 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.307442 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.307600 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.307956 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308173 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308291 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308363 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308439 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308503 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.308584 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.308696 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="setup-container" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309212 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="13254c8b-516c-435e-9db2-a8d518434f29" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309347 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" containerName="rabbitmq" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.309522 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="348878ea-aa9f-4306-af10-6a56583447a4" containerName="oc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.310827 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.313142 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317531 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317632 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.317825 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318373 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8bxdt" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318443 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318449 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.318751 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.332955 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.351577 4837 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13254c8b-516c-435e-9db2-a8d518434f29-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.351611 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.352259 4837 scope.go:117] "RemoveContainer" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.384487 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "13254c8b-516c-435e-9db2-a8d518434f29" (UID: "13254c8b-516c-435e-9db2-a8d518434f29"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.384670 4837 scope.go:117] "RemoveContainer" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.385167 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": container with ID starting with 0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1 not found: ID does not exist" containerID="0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385212 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1"} err="failed to get container status \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": rpc error: code = NotFound desc = could not find container \"0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1\": container with ID starting with 0464fd995746ce013b42a116039d645f924aa9c972effad4862f7a836f1488e1 not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385275 4837 scope.go:117] "RemoveContainer" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: E0313 12:10:20.385776 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": container with ID starting with d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be not found: ID does not exist" containerID="d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.385805 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be"} err="failed to get container status \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": rpc error: code = NotFound desc = could not find container \"d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be\": container with ID starting with d7e3a2439b933c4a76e0a0472aaff3cf352a36e55d6c5a3aa674478c5299b9be not found: ID does not exist" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453448 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453544 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453564 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453583 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453711 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453746 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453772 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453847 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453905 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.453972 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.454087 4837 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13254c8b-516c-435e-9db2-a8d518434f29-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:20 crc 
kubenswrapper[4837]: I0313 12:10:20.557325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557547 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557570 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557602 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557694 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557772 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557795 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557826 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557879 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.557902 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.559273 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-server-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.560275 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.560430 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.561045 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.561057 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.562605 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/245e5a26-d143-4e4d-bae8-094275a91574-config-data\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564265 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/245e5a26-d143-4e4d-bae8-094275a91574-pod-info\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564816 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.564944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.571087 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/245e5a26-d143-4e4d-bae8-094275a91574-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.587455 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvf5h\" (UniqueName: \"kubernetes.io/projected/245e5a26-d143-4e4d-bae8-094275a91574-kube-api-access-jvf5h\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.591560 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.605286 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.616864 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.619016 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624236 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mb2tp" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624544 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624570 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624650 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624751 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.624822 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.625001 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.635041 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"245e5a26-d143-4e4d-bae8-094275a91574\") " pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.639103 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.658500 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761100 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761182 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761209 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761231 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761277 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761325 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761355 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761374 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761404 4837 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761423 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.761443 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863650 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863715 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863740 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863791 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863841 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863868 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863891 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863957 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.863982 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.864052 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.865614 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.865765 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.868873 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.869571 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.869955 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.870201 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.871274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90028d66-5134-4c09-af15-71e754f49bf3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.873117 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90028d66-5134-4c09-af15-71e754f49bf3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.874789 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90028d66-5134-4c09-af15-71e754f49bf3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.875077 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.882915 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57rp\" (UniqueName: \"kubernetes.io/projected/90028d66-5134-4c09-af15-71e754f49bf3-kube-api-access-p57rp\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:20 crc kubenswrapper[4837]: I0313 12:10:20.900798 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90028d66-5134-4c09-af15-71e754f49bf3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.058707 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13254c8b-516c-435e-9db2-a8d518434f29" path="/var/lib/kubelet/pods/13254c8b-516c-435e-9db2-a8d518434f29/volumes" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.059717 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b01be4-73b6-48eb-a06d-4fb38863d982" path="/var/lib/kubelet/pods/e7b01be4-73b6-48eb-a06d-4fb38863d982/volumes" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.109331 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.140380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.216867 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"669ae6a1c242dd485f718081cac25de23c2b27fae539b95b098d625b50bfde0c"} Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.576261 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.707214 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.708921 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.711491 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.721197 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891579 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891776 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891809 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891828 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891855 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.891874 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.993953 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994126 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994172 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994201 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994240 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994261 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.994304 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995207 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995210 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995554 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.995710 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:21 crc kubenswrapper[4837]: I0313 12:10:21.996095 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.016379 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"dnsmasq-dns-79bd4cc8c9-smx7d\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.068252 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.245095 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"bd5a45941299ca017c1df94156bbe52bd40d508730d729dff288c7e63fa8ac1a"} Mar 13 12:10:22 crc kubenswrapper[4837]: I0313 12:10:22.527161 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.255390 4837 generic.go:334] "Generic (PLEG): container finished" podID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" exitCode=0 Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.255463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e"} Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.255742 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerStarted","Data":"b996ce339d9115538429d8844c7070a737c07d9e64960dc8589245407a365c7b"} Mar 13 12:10:23 crc kubenswrapper[4837]: I0313 12:10:23.257431 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.271214 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerStarted","Data":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.271917 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.274522 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6"} Mar 13 12:10:24 crc kubenswrapper[4837]: I0313 12:10:24.297827 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" podStartSLOduration=3.297807042 podStartE2EDuration="3.297807042s" podCreationTimestamp="2026-03-13 12:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:24.286674592 +0000 UTC m=+1339.924941365" watchObservedRunningTime="2026-03-13 12:10:24.297807042 +0000 UTC m=+1339.936073805" Mar 13 12:10:27 crc kubenswrapper[4837]: I0313 12:10:27.235144 4837 scope.go:117] "RemoveContainer" containerID="945088ee0e42cd72cf70828366cf9ffb988a0eebcb4e0d5222d7e3f1439eeef4" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.070095 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.125977 4837 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.126216 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" containerID="cri-o://daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" gracePeriod=10 Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.297549 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.299469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.312478 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.372553 4837 generic.go:334] "Generic (PLEG): container finished" podID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerID="daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" exitCode=0 Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.372610 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6"} Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402252 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402364 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402411 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " 
pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402465 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.402523 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504268 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504342 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504414 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504435 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504477 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504565 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.504606 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " 
pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505360 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-config\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505393 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505541 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-svc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.505562 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.506053 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.506198 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98f4bdc5-6452-4630-a299-6234d8a63bf8-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.530882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqtvc\" (UniqueName: \"kubernetes.io/projected/98f4bdc5-6452-4630-a299-6234d8a63bf8-kube-api-access-gqtvc\") pod \"dnsmasq-dns-55478c4467-bxc2t\" (UID: \"98f4bdc5-6452-4630-a299-6234d8a63bf8\") " pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.633687 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.771064 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912716 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912758 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912798 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912878 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.912927 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.913089 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") pod \"6d9c85e6-5c66-4c94-996b-0278453fd29c\" (UID: \"6d9c85e6-5c66-4c94-996b-0278453fd29c\") " Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.918885 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8" (OuterVolumeSpecName: "kube-api-access-zdwv8") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "kube-api-access-zdwv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.969235 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config" (OuterVolumeSpecName: "config") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.973061 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.973594 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.974002 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:32 crc kubenswrapper[4837]: I0313 12:10:32.981162 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d9c85e6-5c66-4c94-996b-0278453fd29c" (UID: "6d9c85e6-5c66-4c94-996b-0278453fd29c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015609 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015660 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015672 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015683 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015696 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d9c85e6-5c66-4c94-996b-0278453fd29c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.015707 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdwv8\" (UniqueName: \"kubernetes.io/projected/6d9c85e6-5c66-4c94-996b-0278453fd29c-kube-api-access-zdwv8\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.100057 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-bxc2t"] Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" event={"ID":"6d9c85e6-5c66-4c94-996b-0278453fd29c","Type":"ContainerDied","Data":"b047a2dec8a72a4326c0ddd3270cd5183bb922bd572e2b9241c600e148c1eea7"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382434 4837 scope.go:117] "RemoveContainer" 
containerID="daf8bcea9fd0562663127a8a93a369152f67e1407f2a8bee704558594419d5d6" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.382279 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ql9zn" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.383997 4837 generic.go:334] "Generic (PLEG): container finished" podID="98f4bdc5-6452-4630-a299-6234d8a63bf8" containerID="def881830f11c938b1f9f72bad0579bbe4bed6b4616080645ffbfc4ade37dd2a" exitCode=0 Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.384059 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerDied","Data":"def881830f11c938b1f9f72bad0579bbe4bed6b4616080645ffbfc4ade37dd2a"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.384081 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerStarted","Data":"9df755008f8ca09ac808d1f5e7998ca4a4d73b0d2045552cf2e9dc79be637045"} Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.573458 4837 scope.go:117] "RemoveContainer" containerID="0ac8018727334fad931d8e9b782b5ff6d28c6c9743c0f7da2e79336a427ee5cf" Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.598081 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:33 crc kubenswrapper[4837]: I0313 12:10:33.609660 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ql9zn"] Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 12:10:34.393885 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" event={"ID":"98f4bdc5-6452-4630-a299-6234d8a63bf8","Type":"ContainerStarted","Data":"7c3bcb357ed3d7475a2ec4327761fa5510a02f8b713eea793ac3ea35316b3c01"} Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 12:10:34.394227 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:34 crc kubenswrapper[4837]: I0313 12:10:34.416782 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" podStartSLOduration=2.416765755 podStartE2EDuration="2.416765755s" podCreationTimestamp="2026-03-13 12:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:34.413054958 +0000 UTC m=+1350.051321741" watchObservedRunningTime="2026-03-13 12:10:34.416765755 +0000 UTC m=+1350.055032508" Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.057798 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" path="/var/lib/kubelet/pods/6d9c85e6-5c66-4c94-996b-0278453fd29c/volumes" Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.484180 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:10:35 crc kubenswrapper[4837]: I0313 12:10:35.484250 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.635505 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-bxc2t" Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.704048 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:42 crc kubenswrapper[4837]: I0313 12:10:42.704940 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" containerID="cri-o://36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" gracePeriod=10 Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.182026 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.310949 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311122 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311213 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311228 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311264 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311280 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.311347 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") pod \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\" (UID: \"8561b7f2-0c2e-44bc-8f9f-be22c9624182\") " Mar 13 12:10:43 
crc kubenswrapper[4837]: I0313 12:10:43.323097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz" (OuterVolumeSpecName: "kube-api-access-bkhxz") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "kube-api-access-bkhxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.362855 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.372950 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.374143 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config" (OuterVolumeSpecName: "config") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.385913 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.386777 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.391971 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8561b7f2-0c2e-44bc-8f9f-be22c9624182" (UID: "8561b7f2-0c2e-44bc-8f9f-be22c9624182"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414345 4837 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414391 4837 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414408 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414421 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkhxz\" (UniqueName: \"kubernetes.io/projected/8561b7f2-0c2e-44bc-8f9f-be22c9624182-kube-api-access-bkhxz\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414433 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414445 4837 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.414459 4837 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8561b7f2-0c2e-44bc-8f9f-be22c9624182-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472445 4837 generic.go:334] "Generic (PLEG): container finished" podID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" exitCode=0 Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472523 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" event={"ID":"8561b7f2-0c2e-44bc-8f9f-be22c9624182","Type":"ContainerDied","Data":"b996ce339d9115538429d8844c7070a737c07d9e64960dc8589245407a365c7b"} Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472548 4837 scope.go:117] "RemoveContainer" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.472585 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-smx7d" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.493679 4837 scope.go:117] "RemoveContainer" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.509303 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.519043 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-smx7d"] Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.531768 4837 scope.go:117] "RemoveContainer" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: E0313 12:10:43.532200 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": container with ID starting with 36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1 not found: ID does not exist" containerID="36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532241 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1"} err="failed to get container status \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": rpc error: code = NotFound desc = could not find container \"36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1\": container with ID starting with 36c7996282b2e0069d58fbcaadfb735cbfdb0845c0b839279be73a9a205804d1 not found: ID does not exist" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532266 4837 scope.go:117] "RemoveContainer" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 12:10:43 crc kubenswrapper[4837]: E0313 12:10:43.532490 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": container with ID starting with 7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e not found: ID does not exist" containerID="7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e" Mar 13 12:10:43 crc kubenswrapper[4837]: I0313 12:10:43.532521 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e"} err="failed to get container status \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": rpc error: code = NotFound desc = could not find container \"7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e\": container with ID starting with 7e7bf0350454cdf6a55ff53b12033e233fa5322f6b553c3d41b099505b8fd05e not found: ID does not exist" Mar 13 12:10:45 crc kubenswrapper[4837]: I0313 12:10:45.059066 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" path="/var/lib/kubelet/pods/8561b7f2-0c2e-44bc-8f9f-be22c9624182/volumes" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.583539 4837 generic.go:334] "Generic (PLEG): container finished" podID="90028d66-5134-4c09-af15-71e754f49bf3" containerID="3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6" 
exitCode=0 Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.583652 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerDied","Data":"3707d1215c01b01e085f415765474c7f7c0f6cccb71ba878a0cb6e1bc6e40be6"} Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.586077 4837 generic.go:334] "Generic (PLEG): container finished" podID="245e5a26-d143-4e4d-bae8-094275a91574" containerID="968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c" exitCode=0 Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.586123 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerDied","Data":"968024f3e7a34fb2251686b56e03a3d25328f059240b5fd48e62fa0112da765c"} Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.766459 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.766956 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.766992 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767013 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767022 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767078 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767090 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="init" Mar 13 12:10:55 crc kubenswrapper[4837]: E0313 12:10:55.767106 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767114 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767388 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8561b7f2-0c2e-44bc-8f9f-be22c9624182" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.767415 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9c85e6-5c66-4c94-996b-0278453fd29c" containerName="dnsmasq-dns" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.768181 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.770732 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.770923 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.771119 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.771280 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.801107 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.844860 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.844973 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.845308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.845388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947028 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947162 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947191 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.947266 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954304 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954375 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.954930 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:55 crc kubenswrapper[4837]: I0313 12:10:55.964803 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.143870 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.607074 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90028d66-5134-4c09-af15-71e754f49bf3","Type":"ContainerStarted","Data":"542ea034b4a23a312c031070a7d0e00e62cbf6d03c66433e44b8f1f7bad49766"} Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.608759 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.611028 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"245e5a26-d143-4e4d-bae8-094275a91574","Type":"ContainerStarted","Data":"a06e916058107256eb45141b722629ae6ffc45e6bbd11377cfb66669bc921055"} Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.611660 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.686038 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.686016439 podStartE2EDuration="36.686016439s" podCreationTimestamp="2026-03-13 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:56.683389956 +0000 UTC m=+1372.321656719" watchObservedRunningTime="2026-03-13 12:10:56.686016439 +0000 UTC m=+1372.324283202" Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.689148 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.689127568 podStartE2EDuration="36.689127568s" podCreationTimestamp="2026-03-13 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:10:56.6534215 +0000 UTC m=+1372.291688263" watchObservedRunningTime="2026-03-13 12:10:56.689127568 +0000 UTC m=+1372.327394341" Mar 13 12:10:56 crc kubenswrapper[4837]: W0313 12:10:56.730066 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfedd3e5_e8d7_4311_9a0d_30276ce40418.slice/crio-a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc WatchSource:0}: Error finding container a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc: Status 404 returned error can't find the container with id a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc Mar 13 12:10:56 crc kubenswrapper[4837]: I0313 12:10:56.730492 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6"] Mar 13 12:10:57 crc kubenswrapper[4837]: I0313 12:10:57.628855 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerStarted","Data":"a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc"} Mar 13 12:11:05 crc kubenswrapper[4837]: I0313 12:11:05.483805 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 13 12:11:05 crc kubenswrapper[4837]: I0313 12:11:05.484120 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.285088 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.765370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerStarted","Data":"d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7"} Mar 13 12:11:07 crc kubenswrapper[4837]: I0313 12:11:07.790610 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" podStartSLOduration=2.241170697 podStartE2EDuration="12.790585644s" podCreationTimestamp="2026-03-13 12:10:55 +0000 UTC" firstStartedPulling="2026-03-13 12:10:56.73320192 +0000 UTC m=+1372.371468693" lastFinishedPulling="2026-03-13 12:11:07.282616877 +0000 UTC m=+1382.920883640" observedRunningTime="2026-03-13 12:11:07.78695733 +0000 UTC m=+1383.425224093" watchObservedRunningTime="2026-03-13 12:11:07.790585644 +0000 UTC m=+1383.428852407" Mar 13 12:11:10 crc kubenswrapper[4837]: I0313 12:11:10.642876 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 12:11:11 crc kubenswrapper[4837]: I0313 12:11:11.112794 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 12:11:17 crc kubenswrapper[4837]: I0313 12:11:17.853166 4837 generic.go:334] "Generic (PLEG): container finished" podID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerID="d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7" exitCode=0 Mar 13 12:11:17 crc kubenswrapper[4837]: I0313 12:11:17.853250 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerDied","Data":"d8a71036699a5c429f4958ee9f2fd50fed11eca00708b1169cdeb9b07548dda7"} Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.294353 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383433 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383522 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383551 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.383708 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") pod \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\" (UID: \"bfedd3e5-e8d7-4311-9a0d-30276ce40418\") " Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.389680 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689" (OuterVolumeSpecName: "kube-api-access-lc689") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "kube-api-access-lc689". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.393730 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.416745 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.416791 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory" (OuterVolumeSpecName: "inventory") pod "bfedd3e5-e8d7-4311-9a0d-30276ce40418" (UID: "bfedd3e5-e8d7-4311-9a0d-30276ce40418"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486061 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486098 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486110 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc689\" (UniqueName: \"kubernetes.io/projected/bfedd3e5-e8d7-4311-9a0d-30276ce40418-kube-api-access-lc689\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.486123 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfedd3e5-e8d7-4311-9a0d-30276ce40418-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876450 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" event={"ID":"bfedd3e5-e8d7-4311-9a0d-30276ce40418","Type":"ContainerDied","Data":"a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc"} Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876497 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a866d1865427553eddb46b92dbd201d722a968316d595895a86fe588ce4b2bdc" Mar 13 12:11:19 crc kubenswrapper[4837]: I0313 12:11:19.876558 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.992524 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: E0313 12:11:19.993325 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.993341 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.993696 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfedd3e5-e8d7-4311-9a0d-30276ce40418" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:19.994488 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.015883 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.015992 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.016036 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.016350 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.031535 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101703 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101904 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.101943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204702 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204918 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.204981 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.208990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.211423 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.256861 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pz9nt\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.352729 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.871151 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt"] Mar 13 12:11:20 crc kubenswrapper[4837]: I0313 12:11:20.889087 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerStarted","Data":"b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a"} Mar 13 12:11:21 crc kubenswrapper[4837]: I0313 12:11:21.902766 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerStarted","Data":"45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b"} Mar 13 12:11:21 crc kubenswrapper[4837]: I0313 12:11:21.940152 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" podStartSLOduration=2.496164174 podStartE2EDuration="2.94012824s" podCreationTimestamp="2026-03-13 12:11:19 +0000 UTC" firstStartedPulling="2026-03-13 12:11:20.8758945 +0000 UTC m=+1396.514161263" lastFinishedPulling="2026-03-13 12:11:21.319858566 +0000 UTC m=+1396.958125329" observedRunningTime="2026-03-13 12:11:21.931866699 +0000 UTC m=+1397.570133502" watchObservedRunningTime="2026-03-13 12:11:21.94012824 +0000 UTC m=+1397.578395003" Mar 13 12:11:23 crc kubenswrapper[4837]: I0313 12:11:23.921726 4837 generic.go:334] "Generic (PLEG): container finished" podID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerID="45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b" exitCode=0 Mar 13 12:11:23 crc kubenswrapper[4837]: I0313 12:11:23.921832 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerDied","Data":"45b5081c34f5edc0075777fcfa00e2d62084726d76624bcb0330bd64e206316b"} Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.337971 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.502983 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.503399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.503531 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") pod \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\" (UID: \"0b7402b1-0b76-4ffa-b37f-6e014183f6a6\") " Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.511994 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk" (OuterVolumeSpecName: "kube-api-access-959mk") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "kube-api-access-959mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.537153 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory" (OuterVolumeSpecName: "inventory") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.540402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b7402b1-0b76-4ffa-b37f-6e014183f6a6" (UID: "0b7402b1-0b76-4ffa-b37f-6e014183f6a6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605440 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959mk\" (UniqueName: \"kubernetes.io/projected/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-kube-api-access-959mk\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605479 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.605493 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b7402b1-0b76-4ffa-b37f-6e014183f6a6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941493 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" event={"ID":"0b7402b1-0b76-4ffa-b37f-6e014183f6a6","Type":"ContainerDied","Data":"b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a"} Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941549 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b89bf6808b23352420b0ada98d22d3af61da2bcbf1e90c1ae0b4a1eaddb6418a" Mar 13 12:11:25 crc kubenswrapper[4837]: I0313 12:11:25.941824 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pz9nt" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.417814 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:26 crc kubenswrapper[4837]: E0313 12:11:26.419028 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.419151 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.419423 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7402b1-0b76-4ffa-b37f-6e014183f6a6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.420301 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422310 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422578 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422777 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.422971 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.426181 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520424 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520568 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520650 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.520687 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.622387 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.622963 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.623097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.623254 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.629365 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.631429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.632385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.646704 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:26 crc kubenswrapper[4837]: I0313 12:11:26.737069 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.231460 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj"] Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.960832 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerStarted","Data":"eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16"} Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.961212 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerStarted","Data":"011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76"} Mar 13 12:11:27 crc kubenswrapper[4837]: I0313 12:11:27.985259 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" podStartSLOduration=1.551860845 podStartE2EDuration="1.985228845s" podCreationTimestamp="2026-03-13 12:11:26 +0000 UTC" firstStartedPulling="2026-03-13 12:11:27.232548828 +0000 UTC m=+1402.870815591" lastFinishedPulling="2026-03-13 12:11:27.665916798 +0000 UTC m=+1403.304183591" observedRunningTime="2026-03-13 12:11:27.98221523 +0000 UTC m=+1403.620481983" watchObservedRunningTime="2026-03-13 12:11:27.985228845 +0000 UTC m=+1403.623495608" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.484481 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485103 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485162 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.485978 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:11:35 crc kubenswrapper[4837]: I0313 12:11:35.486034 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" gracePeriod=600 Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032467 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerID="1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" exitCode=0 Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc"} Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} Mar 13 12:11:36 crc kubenswrapper[4837]: I0313 12:11:36.032775 4837 scope.go:117] "RemoveContainer" containerID="75c6e15833f1c4c6d83b741f42f4ce0c9378844641d1d149fd75349d257dfc71" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.148592 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.150697 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154186 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154418 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.154597 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.160749 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.305380 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.407429 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.426324 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"auto-csr-approver-29556732-84qfh\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.480695 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:00 crc kubenswrapper[4837]: I0313 12:12:00.967856 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:12:00 crc kubenswrapper[4837]: W0313 12:12:00.970940 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod564139cd_c95b_45c7_bf55_00c944313930.slice/crio-c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69 WatchSource:0}: Error finding container c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69: Status 404 returned error can't find the container with id c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69 Mar 13 12:12:01 crc kubenswrapper[4837]: I0313 12:12:01.262269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerStarted","Data":"c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69"} Mar 13 12:12:03 crc kubenswrapper[4837]: I0313 12:12:03.296051 4837 generic.go:334] "Generic (PLEG): container finished" podID="564139cd-c95b-45c7-bf55-00c944313930" containerID="e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8" exitCode=0 Mar 13 12:12:03 crc kubenswrapper[4837]: I0313 12:12:03.296460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerDied","Data":"e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8"} Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.614800 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.792936 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") pod \"564139cd-c95b-45c7-bf55-00c944313930\" (UID: \"564139cd-c95b-45c7-bf55-00c944313930\") " Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.800925 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g" (OuterVolumeSpecName: "kube-api-access-k9t7g") pod "564139cd-c95b-45c7-bf55-00c944313930" (UID: "564139cd-c95b-45c7-bf55-00c944313930"). InnerVolumeSpecName "kube-api-access-k9t7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:12:04 crc kubenswrapper[4837]: I0313 12:12:04.895664 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9t7g\" (UniqueName: \"kubernetes.io/projected/564139cd-c95b-45c7-bf55-00c944313930-kube-api-access-k9t7g\") on node \"crc\" DevicePath \"\"" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.316897 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556732-84qfh" event={"ID":"564139cd-c95b-45c7-bf55-00c944313930","Type":"ContainerDied","Data":"c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69"} Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.316944 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2159da86fd08eedf43b76b9b65dccf8e53beca1390d1dbd03003e21443c7f69" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.317017 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556732-84qfh" Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.693410 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:12:05 crc kubenswrapper[4837]: I0313 12:12:05.702611 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556726-gdbfm"] Mar 13 12:12:07 crc kubenswrapper[4837]: I0313 12:12:07.060687 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f46fff-3510-4758-82a0-30099640fa33" path="/var/lib/kubelet/pods/83f46fff-3510-4758-82a0-30099640fa33/volumes" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.465546 4837 scope.go:117] "RemoveContainer" containerID="8b3e536f3d4311421b7a8a53f994fc3c95b97d5e112a955f101e290d9b221b2d" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.524304 4837 scope.go:117] "RemoveContainer" containerID="e03f96aaa50d1c9241f9e2fad6e8df257f1e78642de37d3f89872036b5b55220" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.544308 4837 scope.go:117] "RemoveContainer" containerID="1858aaffb80ca26b2ecab85a7aa907d93bda6b050db7fd69c55fcebb623536ef" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.609088 4837 scope.go:117] "RemoveContainer" containerID="a3d9d75be9f89d9ac614473e4e3a4f535965320bd55937576eb6b69f6cb8f8b9" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.641545 4837 scope.go:117] "RemoveContainer" containerID="57b8ae831c66c62748afbdcfeed21457125293d241eef5e2c9e04fa2bc86f046" Mar 13 12:12:27 crc kubenswrapper[4837]: I0313 12:12:27.661331 4837 scope.go:117] "RemoveContainer" containerID="f8f234cd31d0132024229747ad2a8277b3ce2f09009460632455703d08203032" Mar 13 12:13:27 crc kubenswrapper[4837]: I0313 12:13:27.809314 4837 scope.go:117] "RemoveContainer" containerID="40a0292d1dfe433f0d44082f12bb7e30ff5d447f6e395b1d7a570420f6252eeb" Mar 13 12:13:27 crc kubenswrapper[4837]: I0313 12:13:27.849075 4837 scope.go:117] "RemoveContainer" containerID="ccf4fdc9606b0ae8a6ecc82badd31da8c6fddc1f4294bee13d5805f8da627b43" Mar 13 12:13:35 crc kubenswrapper[4837]: I0313 12:13:35.484198 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:13:35 crc kubenswrapper[4837]: I0313 12:13:35.485209 4837 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.154436 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: E0313 12:14:00.155372 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.155415 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.155668 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="564139cd-c95b-45c7-bf55-00c944313930" containerName="oc" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.156485 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159104 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159289 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.159462 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.174526 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.177279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.279341 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.310347 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"auto-csr-approver-29556734-g7zt7\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.490363 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.926576 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:14:00 crc kubenswrapper[4837]: I0313 12:14:00.934558 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:14:01 crc kubenswrapper[4837]: I0313 12:14:01.587993 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerStarted","Data":"3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f"} Mar 13 12:14:02 crc kubenswrapper[4837]: I0313 12:14:02.598662 4837 generic.go:334] "Generic (PLEG): container finished" podID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerID="618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba" exitCode=0 Mar 13 12:14:02 crc kubenswrapper[4837]: I0313 12:14:02.598765 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerDied","Data":"618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba"} Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.914717 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.963260 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") pod \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\" (UID: \"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8\") " Mar 13 12:14:03 crc kubenswrapper[4837]: I0313 12:14:03.970726 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq" (OuterVolumeSpecName: "kube-api-access-kvpjq") pod "b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" (UID: "b41b916d-46ab-43e8-b624-bb1fb6aaf2f8"). InnerVolumeSpecName "kube-api-access-kvpjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.065694 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvpjq\" (UniqueName: \"kubernetes.io/projected/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8-kube-api-access-kvpjq\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618510 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" event={"ID":"b41b916d-46ab-43e8-b624-bb1fb6aaf2f8","Type":"ContainerDied","Data":"3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f"} Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618590 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f46648b6ee317959b85aab3af5418a8561594a01ed81bb2dd102242f281029f" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.618595 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556734-g7zt7" Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.982329 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:14:04 crc kubenswrapper[4837]: I0313 12:14:04.991074 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556728-7n29h"] Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.059382 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ae408b-faad-4a52-ad09-428242645381" path="/var/lib/kubelet/pods/47ae408b-faad-4a52-ad09-428242645381/volumes" Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.484471 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:05 crc kubenswrapper[4837]: I0313 12:14:05.484532 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:22 crc kubenswrapper[4837]: I0313 12:14:22.799249 4837 generic.go:334] "Generic (PLEG): container finished" podID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerID="eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16" exitCode=0 Mar 13 12:14:22 crc kubenswrapper[4837]: I0313 12:14:22.799929 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerDied","Data":"eb1333ce0764b093d73fd17e5289d7edb5cff5fe2036f478ee8e0d94f4ed2a16"} Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.200193 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284492 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284859 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.284950 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") pod \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\" (UID: \"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6\") " Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.300365 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.300470 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj" (OuterVolumeSpecName: "kube-api-access-srznj") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "kube-api-access-srznj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.312524 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory" (OuterVolumeSpecName: "inventory") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.313091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" (UID: "2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.387982 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388020 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388030 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srznj\" (UniqueName: \"kubernetes.io/projected/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-kube-api-access-srznj\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.388040 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817263 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" event={"ID":"2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6","Type":"ContainerDied","Data":"011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76"} Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817312 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011972819d057043c306599fed7fa5342801862959ffcdf2a7d97969e48cbd76" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.817350 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.901615 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:24 crc kubenswrapper[4837]: E0313 12:14:24.902345 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.902470 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: E0313 12:14:24.902592 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.902691 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.903048 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" containerName="oc" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.903169 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.904044 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907463 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907521 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.907914 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.916304 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.998247 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.998316 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:24 crc kubenswrapper[4837]: I0313 12:14:24.998406 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100492 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.100816 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6b22\" (UniqueName: 
\"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.106018 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.106022 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.121180 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.222434 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.726340 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts"] Mar 13 12:14:25 crc kubenswrapper[4837]: I0313 12:14:25.825445 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerStarted","Data":"ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794"} Mar 13 12:14:26 crc kubenswrapper[4837]: I0313 12:14:26.836601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerStarted","Data":"846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27"} Mar 13 12:14:26 crc kubenswrapper[4837]: I0313 12:14:26.862404 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" podStartSLOduration=2.033479004 podStartE2EDuration="2.862383208s" podCreationTimestamp="2026-03-13 12:14:24 +0000 UTC" firstStartedPulling="2026-03-13 12:14:25.722573792 +0000 UTC m=+1581.360840555" lastFinishedPulling="2026-03-13 12:14:26.551477956 +0000 UTC m=+1582.189744759" observedRunningTime="2026-03-13 12:14:26.853420486 +0000 UTC m=+1582.491687249" watchObservedRunningTime="2026-03-13 12:14:26.862383208 +0000 UTC m=+1582.500649971" Mar 13 12:14:28 crc kubenswrapper[4837]: I0313 12:14:28.094329 4837 scope.go:117] "RemoveContainer" containerID="d2184d47fa1ce72a82da97184468ccee1cece609eb9ab8fb1194680ef9c8ea21" Mar 13 12:14:35 crc 
kubenswrapper[4837]: I0313 12:14:35.484293 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.484940 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485001 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485899 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.485958 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" gracePeriod=600 Mar 13 12:14:35 crc kubenswrapper[4837]: E0313 12:14:35.629686 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927384 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" exitCode=0 Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927438 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a"} Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.927487 4837 scope.go:117] "RemoveContainer" containerID="1c6d9e7e9de5c8ffd75bcf8d5717605d713d8068815596e54e918770c94282bc" Mar 13 12:14:35 crc kubenswrapper[4837]: I0313 12:14:35.928219 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:14:35 crc kubenswrapper[4837]: E0313 12:14:35.928484 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:14:49 crc kubenswrapper[4837]: I0313 12:14:49.048373 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:14:49 crc kubenswrapper[4837]: E0313 12:14:49.049098 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.150780 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.152394 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.154555 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.155273 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.177333 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.314917 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.314990 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.315364 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416586 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"collect-profiles-29556735-548l8\" (UID: 
\"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416736 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.416834 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.417604 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.428480 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.434576 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"collect-profiles-29556735-548l8\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.473468 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:00 crc kubenswrapper[4837]: I0313 12:15:00.930497 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.050528 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:01 crc kubenswrapper[4837]: E0313 12:15:01.050812 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.160465 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerStarted","Data":"d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b"} Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.160514 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerStarted","Data":"2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047"} Mar 13 12:15:01 crc kubenswrapper[4837]: I0313 12:15:01.187524 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" podStartSLOduration=1.187501544 podStartE2EDuration="1.187501544s" podCreationTimestamp="2026-03-13 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:15:01.176348452 +0000 UTC m=+1616.814615215" watchObservedRunningTime="2026-03-13 12:15:01.187501544 +0000 UTC m=+1616.825768307" Mar 13 12:15:02 crc kubenswrapper[4837]: I0313 12:15:02.177606 4837 generic.go:334] "Generic (PLEG): container finished" podID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerID="d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b" exitCode=0 Mar 13 12:15:02 crc kubenswrapper[4837]: I0313 12:15:02.177670 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerDied","Data":"d444c3350e9c8e5cb2de80b3a01e0398d12ecfb5f36d63e1963c4019db354d3b"} Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.563206 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.675887 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.675976 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.676216 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") pod \"3c6ce131-8677-48bc-8f07-b53837bd751b\" (UID: \"3c6ce131-8677-48bc-8f07-b53837bd751b\") " Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.676613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume" (OuterVolumeSpecName: "config-volume") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.682139 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88" (OuterVolumeSpecName: "kube-api-access-dht88") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "kube-api-access-dht88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.682263 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c6ce131-8677-48bc-8f07-b53837bd751b" (UID: "3c6ce131-8677-48bc-8f07-b53837bd751b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779088 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c6ce131-8677-48bc-8f07-b53837bd751b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779398 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c6ce131-8677-48bc-8f07-b53837bd751b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:03 crc kubenswrapper[4837]: I0313 12:15:03.779412 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dht88\" (UniqueName: \"kubernetes.io/projected/3c6ce131-8677-48bc-8f07-b53837bd751b-kube-api-access-dht88\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198161 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" event={"ID":"3c6ce131-8677-48bc-8f07-b53837bd751b","Type":"ContainerDied","Data":"2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047"} Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198211 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2919ea9a2d0b93f84edb2e77c66e275804e6976a3a742b763c61cae4d408b047" Mar 13 12:15:04 crc kubenswrapper[4837]: I0313 12:15:04.198265 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8" Mar 13 12:15:14 crc kubenswrapper[4837]: I0313 12:15:14.053518 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:14 crc kubenswrapper[4837]: E0313 12:15:14.054898 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:28 crc kubenswrapper[4837]: I0313 12:15:28.048842 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:28 crc kubenswrapper[4837]: E0313 12:15:28.049605 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:28 crc kubenswrapper[4837]: I0313 12:15:28.180352 4837 scope.go:117] "RemoveContainer" containerID="2b72c4b74ac632994ae39578139216d840009de89378dfe0823503769ad992b6" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.045130 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:38 crc kubenswrapper[4837]: E0313 12:15:38.045853 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" Mar 13 
12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.045871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.046128 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" containerName="collect-profiles" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.048015 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.064699 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162399 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.162617 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.264625 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.264968 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " 
pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.265538 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.284387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"community-operators-k4549\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.374498 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:38 crc kubenswrapper[4837]: I0313 12:15:38.874137 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.552792 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d" exitCode=0 Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.552925 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"} Mar 13 12:15:39 crc kubenswrapper[4837]: I0313 12:15:39.553133 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"dcd16009bc1432999865446905a1cb6d82986e53bcf2a6d43b335dbf163a3472"} Mar 13 12:15:40 crc kubenswrapper[4837]: I0313 12:15:40.048145 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:40 crc kubenswrapper[4837]: E0313 12:15:40.048445 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:40 crc kubenswrapper[4837]: I0313 12:15:40.566976 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} Mar 13 12:15:41 crc kubenswrapper[4837]: I0313 12:15:41.581481 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf" exitCode=0 Mar 13 12:15:41 crc kubenswrapper[4837]: I0313 12:15:41.581626 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" 
event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} Mar 13 12:15:42 crc kubenswrapper[4837]: I0313 12:15:42.591991 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerStarted","Data":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"} Mar 13 12:15:42 crc kubenswrapper[4837]: I0313 12:15:42.612698 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4549" podStartSLOduration=2.098638901 podStartE2EDuration="4.61267534s" podCreationTimestamp="2026-03-13 12:15:38 +0000 UTC" firstStartedPulling="2026-03-13 12:15:39.554951186 +0000 UTC m=+1655.193217949" lastFinishedPulling="2026-03-13 12:15:42.068987625 +0000 UTC m=+1657.707254388" observedRunningTime="2026-03-13 12:15:42.608271281 +0000 UTC m=+1658.246538044" watchObservedRunningTime="2026-03-13 12:15:42.61267534 +0000 UTC m=+1658.250942103" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.428581 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.432000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.450814 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.498925 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.499146 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.500062 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602313 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod 
\"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602374 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602729 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.602776 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.621792 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"redhat-marketplace-66kqz\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:45 crc kubenswrapper[4837]: I0313 12:15:45.767196 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.053467 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n42jz"] Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.076436 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"] Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.091727 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n42jz"] Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.101431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a48b-account-create-update-ckblt"] Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.215363 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:46 crc kubenswrapper[4837]: W0313 12:15:46.217219 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39642113_74ee_406e_9ffa_5b1f8a86f0a3.slice/crio-65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8 WatchSource:0}: Error finding container 65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8: Status 404 returned error can't find the container with id 65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8 Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638155 4837 generic.go:334] "Generic (PLEG): container finished" podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" exitCode=0 Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638206 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64"} Mar 13 12:15:46 crc kubenswrapper[4837]: I0313 12:15:46.638232 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerStarted","Data":"65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8"} Mar 13 12:15:47 crc kubenswrapper[4837]: I0313 12:15:47.066447 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28320b08-9dde-491d-b151-21f93395bf10" path="/var/lib/kubelet/pods/28320b08-9dde-491d-b151-21f93395bf10/volumes" Mar 13 12:15:47 crc kubenswrapper[4837]: I0313 12:15:47.067427 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2936dcb-f1fa-446b-b20f-87e09a9c03ee" path="/var/lib/kubelet/pods/f2936dcb-f1fa-446b-b20f-87e09a9c03ee/volumes" Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.375848 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.376357 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.427924 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.656314 4837 generic.go:334] "Generic (PLEG): container finished" 
podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" exitCode=0 Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.656825 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425"} Mar 13 12:15:48 crc kubenswrapper[4837]: I0313 12:15:48.720450 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:49 crc kubenswrapper[4837]: I0313 12:15:49.668051 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerStarted","Data":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"} Mar 13 12:15:49 crc kubenswrapper[4837]: I0313 12:15:49.686740 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66kqz" podStartSLOduration=2.160361541 podStartE2EDuration="4.68672236s" podCreationTimestamp="2026-03-13 12:15:45 +0000 UTC" firstStartedPulling="2026-03-13 12:15:46.639932671 +0000 UTC m=+1662.278199434" lastFinishedPulling="2026-03-13 12:15:49.16629349 +0000 UTC m=+1664.804560253" observedRunningTime="2026-03-13 12:15:49.684142579 +0000 UTC m=+1665.322409362" watchObservedRunningTime="2026-03-13 12:15:49.68672236 +0000 UTC m=+1665.324989123" Mar 13 12:15:50 crc kubenswrapper[4837]: I0313 12:15:50.420549 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:50 crc kubenswrapper[4837]: I0313 12:15:50.675317 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4549" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server" containerID="cri-o://699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" gracePeriod=2 Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.045080 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.080127 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d9fb-account-create-update-5jvwd"] Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.211918 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349177 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349235 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.349427 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") pod \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\" (UID: \"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66\") " Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.350078 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities" (OuterVolumeSpecName: "utilities") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.360920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b" (OuterVolumeSpecName: "kube-api-access-cgz7b") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "kube-api-access-cgz7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.398157 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" (UID: "b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451809 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgz7b\" (UniqueName: \"kubernetes.io/projected/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-kube-api-access-cgz7b\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451852 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.451864 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684590 4837 generic.go:334] "Generic (PLEG): container finished" podID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" exitCode=0 Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"} Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684743 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4549" event={"ID":"b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66","Type":"ContainerDied","Data":"dcd16009bc1432999865446905a1cb6d82986e53bcf2a6d43b335dbf163a3472"} Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684717 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4549" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.684765 4837 scope.go:117] "RemoveContainer" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.687992 4837 generic.go:334] "Generic (PLEG): container finished" podID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerID="846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27" exitCode=0 Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.688044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerDied","Data":"846a6e75ac7966b1f1da247e3de2868e0139228fe381ae0693bde11ff4d07f27"} Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.710320 4837 scope.go:117] "RemoveContainer" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.748284 4837 scope.go:117] "RemoveContainer" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.764051 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.775056 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k4549"] Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.794663 4837 scope.go:117] "RemoveContainer" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.795219 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": container with ID starting with 699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827 not found: ID does not exist" containerID="699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795273 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827"} err="failed to get container status \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": rpc error: code = NotFound desc = could not find container \"699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827\": container with ID starting with 699d7f4f1dce9e09c45dda674b26e74507cafaedac64cc20ccd0ff7ab71a9827 not found: ID does not exist" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795303 4837 scope.go:117] "RemoveContainer" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf" Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.795664 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": container with ID starting with 7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf not found: ID does not exist" containerID="7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795734 4837 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf"} err="failed to get container status \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": rpc error: code = NotFound desc = could not find container \"7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf\": container with ID starting with 7a4a3be12aa2fb097f4306fe7869dec5b8e0644f7a601b973aa63aeea41540bf not found: ID does not exist" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.795780 4837 scope.go:117] "RemoveContainer" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d" Mar 13 12:15:51 crc kubenswrapper[4837]: E0313 12:15:51.796172 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": container with ID starting with cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d not found: ID does not exist" containerID="cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d" Mar 13 12:15:51 crc kubenswrapper[4837]: I0313 12:15:51.796218 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d"} err="failed to get container status \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": rpc error: code = NotFound desc = could not find container \"cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d\": container with ID starting with cfa7b2fa10a0f7aa076cbf31c5b4937fe47ce2198c6b9fe0219231b848a6a60d not found: ID does not exist" Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.037466 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.063723 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.077429 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.089180 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rb248"] Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.099705 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d970-account-create-update-lkc7z"] Mar 13 12:15:52 crc kubenswrapper[4837]: I0313 12:15:52.107798 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gmczg"] Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.079340 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2230cdcb-087e-4882-8aea-c5d850b711ac" path="/var/lib/kubelet/pods/2230cdcb-087e-4882-8aea-c5d850b711ac/volumes" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.080740 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e" path="/var/lib/kubelet/pods/5c7727a9-3fb1-4a27-8bcd-721f8d5aeb9e/volumes" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.082212 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737740b8-437c-4c6a-a16f-ac0afcf40b95" path="/var/lib/kubelet/pods/737740b8-437c-4c6a-a16f-ac0afcf40b95/volumes" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.083026 4837 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" path="/var/lib/kubelet/pods/b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66/volumes" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.083916 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5601ea4-ee81-4e2a-b370-268652332465" path="/var/lib/kubelet/pods/c5601ea4-ee81-4e2a-b370-268652332465/volumes" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.227461 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.386098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.386940 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.387098 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") pod \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\" (UID: \"121f6d1b-1277-4d68-8a48-6c4630dd6fe5\") " Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.392052 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22" (OuterVolumeSpecName: "kube-api-access-g6b22") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "kube-api-access-g6b22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.420048 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory" (OuterVolumeSpecName: "inventory") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.420442 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "121f6d1b-1277-4d68-8a48-6c4630dd6fe5" (UID: "121f6d1b-1277-4d68-8a48-6c4630dd6fe5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.488982 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6b22\" (UniqueName: \"kubernetes.io/projected/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-kube-api-access-g6b22\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.489015 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.489026 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/121f6d1b-1277-4d68-8a48-6c4630dd6fe5-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" event={"ID":"121f6d1b-1277-4d68-8a48-6c4630dd6fe5","Type":"ContainerDied","Data":"ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794"} Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707284 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1d62890ff7c257d9b17342f3219ac6a0097e8282e72a43ca3f00b10ba0d794" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.707289 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785037 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"] Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785453 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-utilities" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785475 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-utilities" Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785490 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785497 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server" Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785525 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-content" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785536 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="extract-content" Mar 13 12:15:53 crc kubenswrapper[4837]: E0313 12:15:53.785548 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785555 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 12:15:53 crc kubenswrapper[4837]: 
I0313 12:15:53.785777 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="121f6d1b-1277-4d68-8a48-6c4630dd6fe5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.785790 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eacd7b-4868-4fbe-aa9f-5f0f05f07e66" containerName="registry-server" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.786493 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792006 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792065 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792145 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.792378 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.800297 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"] Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896219 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896283 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.896311 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998247 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998339 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:53 crc kubenswrapper[4837]: I0313 12:15:53.998373 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.002420 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.004223 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.014831 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-s95mk\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.108791 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.624111 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk"] Mar 13 12:15:54 crc kubenswrapper[4837]: I0313 12:15:54.717460 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerStarted","Data":"0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c"} Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.056549 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:15:55 crc kubenswrapper[4837]: E0313 12:15:55.056829 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.728784 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerStarted","Data":"3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1"} Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.746181 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" podStartSLOduration=2.282869183 podStartE2EDuration="2.746164508s" podCreationTimestamp="2026-03-13 12:15:53 +0000 UTC" firstStartedPulling="2026-03-13 12:15:54.628409039 +0000 UTC m=+1670.266675802" lastFinishedPulling="2026-03-13 12:15:55.091704364 +0000 UTC m=+1670.729971127" observedRunningTime="2026-03-13 12:15:55.744597269 +0000 UTC m=+1671.382864042" watchObservedRunningTime="2026-03-13 12:15:55.746164508 +0000 UTC m=+1671.384431271" Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.769866 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.772052 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:55 crc kubenswrapper[4837]: I0313 12:15:55.820837 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:56 crc kubenswrapper[4837]: I0313 12:15:56.789593 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:56 crc kubenswrapper[4837]: I0313 12:15:56.845342 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:58 crc kubenswrapper[4837]: I0313 12:15:58.751360 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66kqz" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" 
containerID="cri-o://89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" gracePeriod=2 Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.037423 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.061890 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zgdc9"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.217714 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.306868 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.307353 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.307602 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") pod \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\" (UID: \"39642113-74ee-406e-9ffa-5b1f8a86f0a3\") " Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.308557 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities" (OuterVolumeSpecName: "utilities") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.316392 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br" (OuterVolumeSpecName: "kube-api-access-c96br") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "kube-api-access-c96br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.335660 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39642113-74ee-406e-9ffa-5b1f8a86f0a3" (UID: "39642113-74ee-406e-9ffa-5b1f8a86f0a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409853 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409904 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39642113-74ee-406e-9ffa-5b1f8a86f0a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.409922 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c96br\" (UniqueName: \"kubernetes.io/projected/39642113-74ee-406e-9ffa-5b1f8a86f0a3-kube-api-access-c96br\") on node \"crc\" DevicePath \"\"" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768449 4837 generic.go:334] "Generic (PLEG): container finished" podID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" exitCode=0 Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768508 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"} Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66kqz" event={"ID":"39642113-74ee-406e-9ffa-5b1f8a86f0a3","Type":"ContainerDied","Data":"65f583c268758f75b55fc8d8b1b7841f85ff3b2eb56e34fd425055fd65c76ae8"} Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768580 4837 scope.go:117] "RemoveContainer" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.768847 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66kqz" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.815501 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.817572 4837 scope.go:117] "RemoveContainer" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.826743 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66kqz"] Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.853968 4837 scope.go:117] "RemoveContainer" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.879552 4837 scope.go:117] "RemoveContainer" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.880109 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": container with ID starting with 89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb not found: ID does not exist" containerID="89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880153 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb"} err="failed to get container status \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": rpc error: code = NotFound desc = could not find container \"89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb\": container with ID starting with 89780f9b8ffa8f283d677aebde3bc5fc7e658a3ddfba1bebd349e0674ebc91eb not found: ID does not exist" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880184 4837 scope.go:117] "RemoveContainer" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.880616 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": container with ID starting with 90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425 not found: ID does not exist" containerID="90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880669 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425"} err="failed to get container status \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": rpc error: code = NotFound desc = could not find container \"90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425\": container with ID starting with 90f4bdf2a9038cc5810f20ee433a0ea49eba585b46346568ad1d765738079425 not found: ID does not exist" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.880696 4837 scope.go:117] "RemoveContainer" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: E0313 12:15:59.881031 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": container with ID starting with 376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64 not found: ID does not exist" containerID="376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64" Mar 13 12:15:59 crc kubenswrapper[4837]: I0313 12:15:59.881060 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64"} err="failed to get container status \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": rpc error: code = NotFound desc = could not find container \"376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64\": container with ID starting with 376b0d55207197a286fb352a3a99bb1a2ae96c88906c793388f162a064b37a64 not found: ID does not exist" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138343 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138816 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-content" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138837 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-content" Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138862 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-utilities" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138871 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="extract-utilities" Mar 13 12:16:00 crc kubenswrapper[4837]: E0313 12:16:00.138883 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.138891 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.139066 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" containerName="registry-server" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.139695 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.141966 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.142026 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.142153 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.148398 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.226704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.329134 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.345499 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"auto-csr-approver-29556736-26kwx\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.458615 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:00 crc kubenswrapper[4837]: I0313 12:16:00.892177 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.062049 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f" path="/var/lib/kubelet/pods/2f1ebb88-e1c9-4839-9c66-8bd86e4b0d5f/volumes" Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.062814 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39642113-74ee-406e-9ffa-5b1f8a86f0a3" path="/var/lib/kubelet/pods/39642113-74ee-406e-9ffa-5b1f8a86f0a3/volumes" Mar 13 12:16:01 crc kubenswrapper[4837]: I0313 12:16:01.790399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerStarted","Data":"00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb"} Mar 13 12:16:02 crc kubenswrapper[4837]: I0313 12:16:02.799676 4837 generic.go:334] "Generic (PLEG): container finished" podID="a2c1518c-d031-4597-ab77-8626e068bcda" containerID="eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38" exitCode=0 Mar 13 12:16:02 crc kubenswrapper[4837]: I0313 12:16:02.799761 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerDied","Data":"eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38"} Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.193472 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.323713 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") pod \"a2c1518c-d031-4597-ab77-8626e068bcda\" (UID: \"a2c1518c-d031-4597-ab77-8626e068bcda\") " Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.332005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw" (OuterVolumeSpecName: "kube-api-access-n9gzw") pod "a2c1518c-d031-4597-ab77-8626e068bcda" (UID: "a2c1518c-d031-4597-ab77-8626e068bcda"). InnerVolumeSpecName "kube-api-access-n9gzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.425425 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9gzw\" (UniqueName: \"kubernetes.io/projected/a2c1518c-d031-4597-ab77-8626e068bcda-kube-api-access-n9gzw\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556736-26kwx" event={"ID":"a2c1518c-d031-4597-ab77-8626e068bcda","Type":"ContainerDied","Data":"00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb"} Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829532 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00518416b47b957ef8c5e7a34d278e0fa923687f5d90d57d3fd691ac9771deeb" Mar 13 12:16:04 crc kubenswrapper[4837]: I0313 12:16:04.829568 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556736-26kwx" Mar 13 12:16:05 crc kubenswrapper[4837]: I0313 12:16:05.263777 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:16:05 crc kubenswrapper[4837]: I0313 12:16:05.279418 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556730-jvprz"] Mar 13 12:16:07 crc kubenswrapper[4837]: I0313 12:16:07.078466 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348878ea-aa9f-4306-af10-6a56583447a4" path="/var/lib/kubelet/pods/348878ea-aa9f-4306-af10-6a56583447a4/volumes" Mar 13 12:16:09 crc kubenswrapper[4837]: I0313 12:16:09.050992 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:09 crc kubenswrapper[4837]: E0313 12:16:09.052173 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:18 crc kubenswrapper[4837]: I0313 12:16:18.050344 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:16:18 crc kubenswrapper[4837]: I0313 12:16:18.062692 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mbps4"] Mar 13 12:16:19 crc kubenswrapper[4837]: I0313 12:16:19.059332 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685f13a4-d293-4199-8049-67b02c0162c1" path="/var/lib/kubelet/pods/685f13a4-d293-4199-8049-67b02c0162c1/volumes" Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.043741 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.054023 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.062010 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.074277 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.084521 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-g24hg"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.094710 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6b07-account-create-update-wxqsd"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.104564 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-45j5g"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.113982 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.121878 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-330b-account-create-update-snkff"] Mar 13 12:16:22 crc kubenswrapper[4837]: I0313 12:16:22.131810 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9a59-account-create-update-hqxzk"] Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.064769 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72768daf-a5fa-4c8e-b9c3-49cd5f87fe30" path="/var/lib/kubelet/pods/72768daf-a5fa-4c8e-b9c3-49cd5f87fe30/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.065589 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77cef7b0-af86-456f-973b-923cb901b88d" path="/var/lib/kubelet/pods/77cef7b0-af86-456f-973b-923cb901b88d/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.066145 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78456e1-6f14-45d4-ab3f-1fea88af4749" path="/var/lib/kubelet/pods/a78456e1-6f14-45d4-ab3f-1fea88af4749/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.066747 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b37e8b-50ec-402e-ae31-27ff0d84e0be" path="/var/lib/kubelet/pods/e6b37e8b-50ec-402e-ae31-27ff0d84e0be/volumes" Mar 13 12:16:23 crc kubenswrapper[4837]: I0313 12:16:23.067970 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f029b52a-1a09-44b3-affe-9449cd6a5944" path="/var/lib/kubelet/pods/f029b52a-1a09-44b3-affe-9449cd6a5944/volumes" Mar 13 12:16:24 crc kubenswrapper[4837]: I0313 12:16:24.047938 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:24 crc kubenswrapper[4837]: E0313 12:16:24.048466 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:26 crc kubenswrapper[4837]: I0313 12:16:26.035184 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:16:26 crc kubenswrapper[4837]: I0313 12:16:26.043763 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2dlt8"] Mar 13 12:16:27 crc kubenswrapper[4837]: I0313 12:16:27.064078 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe" 
path="/var/lib/kubelet/pods/19cfb16d-f7a7-4f5d-baa9-b00eaecf1dfe/volumes" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.235171 4837 scope.go:117] "RemoveContainer" containerID="258afa8ad3c4b205a4d5ebbc2dad025a8beb1c8bcd26054b8547c8dad13f8f6c" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.256908 4837 scope.go:117] "RemoveContainer" containerID="fbb8d3067503d33b0b6e6a915789395c7b9c10818b3ce84f4506b15f77d6207f" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.300461 4837 scope.go:117] "RemoveContainer" containerID="4ef4f42482f9efbb7e95ba0aa3a8a4567cffbb3946a12623724cae5ed211d4e1" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.340397 4837 scope.go:117] "RemoveContainer" containerID="4f6fb24113c34cb08d7bf34817309c7c27eeab0cdaee4f12683e138394d254b1" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.388049 4837 scope.go:117] "RemoveContainer" containerID="40deea41e769b1017207ec620ac05bd1eeae7028c9b2f3cacb4bc02a7f4fffdf" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.438064 4837 scope.go:117] "RemoveContainer" containerID="248bbf02c11ba4d4459897916fec2f24105abad663f25d012f6888d993c3fbac" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.481438 4837 scope.go:117] "RemoveContainer" containerID="400f25fc20473b4a0989af2562c9f1940f8ca26a8e2532da0bcde1d8c359bf39" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.526622 4837 scope.go:117] "RemoveContainer" containerID="4278a43d1836aa1abbebaa7d3b0197dd5fc3373adc2b4d3124d2a223104eef56" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.579324 4837 scope.go:117] "RemoveContainer" containerID="42bfa52d5c8c4ce4aaf6212f222930fd5d442e727a1a7b492df691e11a1e81f6" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.617356 4837 scope.go:117] "RemoveContainer" containerID="8295d45762eef27ce4120c578b478e84691da779f8c9457d397485b5b46c5eba" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.666443 4837 scope.go:117] "RemoveContainer" containerID="e460ab529bcbaef415dda78934a987cdd80d8b23f4cad796d19dcd468ce2d5f7" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.691013 4837 scope.go:117] "RemoveContainer" containerID="d01d04b228faf7f13c332e53f55aacbde9b692f0da2cccf686b1a57f52fa8fe2" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.709261 4837 scope.go:117] "RemoveContainer" containerID="286a6a1365f30df6b40943e24ec3066d64b002e22ec98bea016b42eeee5b1160" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.726182 4837 scope.go:117] "RemoveContainer" containerID="7c2129e0048255a871372a3d7023ed828ca0d6f1f4e610da012f5353ff07c822" Mar 13 12:16:28 crc kubenswrapper[4837]: I0313 12:16:28.743359 4837 scope.go:117] "RemoveContainer" containerID="9c444d34c403a2440618afe6e0c75ef9551c465f012f8ba4f50c5bde9744bb16" Mar 13 12:16:39 crc kubenswrapper[4837]: I0313 12:16:39.048305 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:39 crc kubenswrapper[4837]: E0313 12:16:39.049964 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.046105 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.059455 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jkthw"] Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.242611 4837 generic.go:334] "Generic (PLEG): container finished" podID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerID="3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1" exitCode=0 Mar 13 12:16:52 crc kubenswrapper[4837]: I0313 12:16:52.242667 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerDied","Data":"3519a322b7f03b5bd477d8dd194033af493b40ab4c32f95974aae213419d2bf1"} Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.072973 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4490fb3-45d7-4b40-ad34-5bf33ba88491" path="/var/lib/kubelet/pods/b4490fb3-45d7-4b40-ad34-5bf33ba88491/volumes" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.664765 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730357 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730456 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.730555 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") pod \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\" (UID: \"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4\") " Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.749446 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn" (OuterVolumeSpecName: "kube-api-access-vwrrn") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "kube-api-access-vwrrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.757751 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.757764 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory" (OuterVolumeSpecName: "inventory") pod "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" (UID: "875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832885 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwrrn\" (UniqueName: \"kubernetes.io/projected/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-kube-api-access-vwrrn\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832918 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:53 crc kubenswrapper[4837]: I0313 12:16:53.832931 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.034226 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.044018 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wdwg2"] Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.048790 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.049056 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259494 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" event={"ID":"875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4","Type":"ContainerDied","Data":"0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c"} Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259533 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b26b64e22cc52cf65f26e43050b957ac334bc911e43a00d74e320b57110468c" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.259543 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-s95mk" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338038 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"] Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.338513 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338530 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:16:54 crc kubenswrapper[4837]: E0313 12:16:54.338554 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338561 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338763 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" containerName="oc" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.338793 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.339396 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342462 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342587 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.342749 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.358418 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"] Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442479 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442811 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") 
" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.442866 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544621 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544798 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.544870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.550158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.550400 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.563742 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42br8\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:54 crc kubenswrapper[4837]: I0313 12:16:54.664740 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.059448 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d0a770-288f-40d8-832e-f5463863bef1" path="/var/lib/kubelet/pods/d2d0a770-288f-40d8-832e-f5463863bef1/volumes" Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.191799 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8"] Mar 13 12:16:55 crc kubenswrapper[4837]: I0313 12:16:55.267277 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerStarted","Data":"7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6"} Mar 13 12:16:56 crc kubenswrapper[4837]: I0313 12:16:56.284135 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerStarted","Data":"fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48"} Mar 13 12:17:00 crc kubenswrapper[4837]: I0313 12:17:00.317123 4837 generic.go:334] "Generic (PLEG): container finished" podID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerID="fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48" exitCode=0 Mar 13 12:17:00 crc kubenswrapper[4837]: I0313 12:17:00.317589 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerDied","Data":"fdf1000310c4e8eefb8a2b2cf15e340f600507dfb833c98fdddfad4aa86a3d48"} Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.744927 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.802817 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.803023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.803154 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") pod \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\" (UID: \"e3ec33da-9091-4eb1-aafa-62b9bdf16072\") " Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.810907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg" (OuterVolumeSpecName: "kube-api-access-bh6vg") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "kube-api-access-bh6vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.835624 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory" (OuterVolumeSpecName: "inventory") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.844289 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3ec33da-9091-4eb1-aafa-62b9bdf16072" (UID: "e3ec33da-9091-4eb1-aafa-62b9bdf16072"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905828 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905874 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3ec33da-9091-4eb1-aafa-62b9bdf16072-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:01 crc kubenswrapper[4837]: I0313 12:17:01.905889 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6vg\" (UniqueName: \"kubernetes.io/projected/e3ec33da-9091-4eb1-aafa-62b9bdf16072-kube-api-access-bh6vg\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337469 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" event={"ID":"e3ec33da-9091-4eb1-aafa-62b9bdf16072","Type":"ContainerDied","Data":"7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6"} Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337878 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e407174b52c46f6e2f99460d322d25ed15b3491c843625b34f53f2a1491b8e6" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.337528 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42br8" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.404919 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"] Mar 13 12:17:02 crc kubenswrapper[4837]: E0313 12:17:02.405377 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.405393 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.405577 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec33da-9091-4eb1-aafa-62b9bdf16072" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.406217 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408070 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408233 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408355 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.408536 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.414259 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"] Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.517604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.517697 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.518030 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 
12:17:02.620255 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.620322 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.620491 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.624075 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.625193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.644715 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-2b48q\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:02 crc kubenswrapper[4837]: I0313 12:17:02.723207 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:03 crc kubenswrapper[4837]: I0313 12:17:03.217958 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q"] Mar 13 12:17:03 crc kubenswrapper[4837]: I0313 12:17:03.348254 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerStarted","Data":"91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064"} Mar 13 12:17:04 crc kubenswrapper[4837]: I0313 12:17:04.357070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerStarted","Data":"5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54"} Mar 13 12:17:04 crc kubenswrapper[4837]: I0313 12:17:04.380393 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" podStartSLOduration=1.939548697 podStartE2EDuration="2.380371752s" podCreationTimestamp="2026-03-13 12:17:02 +0000 UTC" firstStartedPulling="2026-03-13 12:17:03.217141126 +0000 UTC m=+1738.855407889" lastFinishedPulling="2026-03-13 12:17:03.657964171 +0000 UTC m=+1739.296230944" observedRunningTime="2026-03-13 12:17:04.374360713 +0000 UTC m=+1740.012627496" watchObservedRunningTime="2026-03-13 12:17:04.380371752 +0000 UTC m=+1740.018638515" Mar 13 12:17:05 crc kubenswrapper[4837]: I0313 12:17:05.060715 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:05 crc kubenswrapper[4837]: E0313 12:17:05.060972 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:17:06 crc kubenswrapper[4837]: I0313 12:17:06.031357 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:17:06 crc kubenswrapper[4837]: I0313 12:17:06.039005 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7m97"] Mar 13 12:17:07 crc kubenswrapper[4837]: I0313 12:17:07.061263 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af4ac68-a437-4be7-adab-1ef336f0cbda" path="/var/lib/kubelet/pods/3af4ac68-a437-4be7-adab-1ef336f0cbda/volumes" Mar 13 12:17:16 crc kubenswrapper[4837]: I0313 12:17:16.046699 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:17:16 crc kubenswrapper[4837]: I0313 12:17:16.056445 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8vx8g"] Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.040987 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.066415 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c7b2a5-b0b8-433f-b55d-c64eaeea8b76" 
path="/var/lib/kubelet/pods/08c7b2a5-b0b8-433f-b55d-c64eaeea8b76/volumes" Mar 13 12:17:17 crc kubenswrapper[4837]: I0313 12:17:17.067285 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b6qnm"] Mar 13 12:17:18 crc kubenswrapper[4837]: I0313 12:17:18.037929 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:17:18 crc kubenswrapper[4837]: I0313 12:17:18.045934 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qdzjz"] Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.048390 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:19 crc kubenswrapper[4837]: E0313 12:17:19.049010 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.062376 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b808e7-674f-4592-af6e-f7c8682f6a17" path="/var/lib/kubelet/pods/95b808e7-674f-4592-af6e-f7c8682f6a17/volumes" Mar 13 12:17:19 crc kubenswrapper[4837]: I0313 12:17:19.063011 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44db1d6-6da2-41a5-a37f-ffc602f0d55a" path="/var/lib/kubelet/pods/a44db1d6-6da2-41a5-a37f-ffc602f0d55a/volumes" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.031925 4837 scope.go:117] "RemoveContainer" containerID="117a085c3636a60886a4974e5b0fb9b17907bfbb02c0f28e14e88a6a4aada355" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.083599 4837 scope.go:117] "RemoveContainer" containerID="167d2264a85f4435f333e5de927afa95b020419521d018ef924666fe1959c6ff" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.146596 4837 scope.go:117] "RemoveContainer" containerID="483a91e4e8aeb62a4bc9d00fab2fa3f3452e90337b10ae7eb6d6d40d39b495c8" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.183517 4837 scope.go:117] "RemoveContainer" containerID="2d322ad3eeeb347ecc17c10b7e12064f45bbd098c57202ba37c2350f75cdbf0c" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.233252 4837 scope.go:117] "RemoveContainer" containerID="843cf40344096a3f0565478be09bc819697f7ebe87515db62c711cd361ef6ce2" Mar 13 12:17:29 crc kubenswrapper[4837]: I0313 12:17:29.270175 4837 scope.go:117] "RemoveContainer" containerID="ba3dda01a90b7b0d00508491184e90f099c7ae7bc849213376ebbc68b88ffd0f" Mar 13 12:17:32 crc kubenswrapper[4837]: I0313 12:17:32.048726 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:32 crc kubenswrapper[4837]: E0313 12:17:32.049524 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:17:36 crc kubenswrapper[4837]: I0313 12:17:36.634219 4837 generic.go:334] "Generic 
(PLEG): container finished" podID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerID="5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54" exitCode=0 Mar 13 12:17:36 crc kubenswrapper[4837]: I0313 12:17:36.634315 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerDied","Data":"5fbab873202bfa50ec52c325b0665c702c9bbe3e9c1e6e487145d1a320c5bf54"} Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.029930 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.178766 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.178857 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.179102 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") pod \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\" (UID: \"033a02c2-cbe4-4676-ae46-f9b9b17a60fb\") " Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.184125 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj" (OuterVolumeSpecName: "kube-api-access-w9vmj") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "kube-api-access-w9vmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.207765 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory" (OuterVolumeSpecName: "inventory") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.213525 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "033a02c2-cbe4-4676-ae46-f9b9b17a60fb" (UID: "033a02c2-cbe4-4676-ae46-f9b9b17a60fb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281422 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281459 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.281472 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9vmj\" (UniqueName: \"kubernetes.io/projected/033a02c2-cbe4-4676-ae46-f9b9b17a60fb-kube-api-access-w9vmj\") on node \"crc\" DevicePath \"\"" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650469 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" event={"ID":"033a02c2-cbe4-4676-ae46-f9b9b17a60fb","Type":"ContainerDied","Data":"91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064"} Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650765 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91032604b3d0fae03f8ede54a282bb2194bdb5ffb6e2e1270d8112c4c0b7f064" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.650519 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-2b48q" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.736423 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"] Mar 13 12:17:38 crc kubenswrapper[4837]: E0313 12:17:38.737398 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.737423 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.737630 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="033a02c2-cbe4-4676-ae46-f9b9b17a60fb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.738442 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.740757 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.740954 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.742917 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.743880 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.752974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"] Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893806 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893884 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.893935 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996297 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:38 crc kubenswrapper[4837]: I0313 12:17:38.996432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.001385 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.005285 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.013807 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.068903 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.560479 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp"] Mar 13 12:17:39 crc kubenswrapper[4837]: I0313 12:17:39.660010 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerStarted","Data":"56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0"} Mar 13 12:17:40 crc kubenswrapper[4837]: I0313 12:17:40.668552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerStarted","Data":"7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645"} Mar 13 12:17:40 crc kubenswrapper[4837]: I0313 12:17:40.707964 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" podStartSLOduration=2.230347053 podStartE2EDuration="2.707937113s" podCreationTimestamp="2026-03-13 12:17:38 +0000 UTC" firstStartedPulling="2026-03-13 12:17:39.567406978 +0000 UTC m=+1775.205673741" lastFinishedPulling="2026-03-13 12:17:40.044997038 +0000 UTC m=+1775.683263801" observedRunningTime="2026-03-13 12:17:40.686774689 +0000 UTC m=+1776.325041452" watchObservedRunningTime="2026-03-13 12:17:40.707937113 +0000 UTC m=+1776.346203876" Mar 13 12:17:44 crc kubenswrapper[4837]: I0313 12:17:44.048610 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:44 crc kubenswrapper[4837]: 
E0313 12:17:44.049228 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:17:55 crc kubenswrapper[4837]: I0313 12:17:55.056219 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:17:55 crc kubenswrapper[4837]: E0313 12:17:55.057071 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.153388 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.155159 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.157317 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.157401 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.158010 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.163367 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.308943 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.411127 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.432425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"auto-csr-approver-29556738-trdfn\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.482385 4837 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:00 crc kubenswrapper[4837]: I0313 12:18:00.916103 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.040819 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.058841 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.058881 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-78jtc"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.066000 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.074093 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t8qk9"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.082338 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mqgjq"] Mar 13 12:18:01 crc kubenswrapper[4837]: I0313 12:18:01.846449 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerStarted","Data":"8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73"} Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.051143 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.052955 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c124-account-create-update-8zqgg"] Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.856971 4837 generic.go:334] "Generic (PLEG): container finished" podID="abf39778-b981-4807-916d-f62ff0a03ac9" containerID="7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748" exitCode=0 Mar 13 12:18:02 crc kubenswrapper[4837]: I0313 12:18:02.857093 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerDied","Data":"7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748"} Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.030600 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.040902 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8886-account-create-update-ljcrw"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.061478 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac843c1-9934-4711-aae6-7f6920596cb3" path="/var/lib/kubelet/pods/6ac843c1-9934-4711-aae6-7f6920596cb3/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.062296 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb" path="/var/lib/kubelet/pods/8b3f58d1-98f1-4f3c-be58-e64a1d4e9bdb/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.062986 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e397db42-b505-4447-87a2-4c12ed412f28" 
path="/var/lib/kubelet/pods/e397db42-b505-4447-87a2-4c12ed412f28/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.063653 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51457d7-9619-4179-8f01-de6ffe5ceb82" path="/var/lib/kubelet/pods/e51457d7-9619-4179-8f01-de6ffe5ceb82/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065143 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8550d6-aacb-4848-928d-b1581a66d499" path="/var/lib/kubelet/pods/ff8550d6-aacb-4848-928d-b1581a66d499/volumes" Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065714 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:18:03 crc kubenswrapper[4837]: I0313 12:18:03.065747 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4581-account-create-update-w6tc2"] Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.206010 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.292282 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") pod \"abf39778-b981-4807-916d-f62ff0a03ac9\" (UID: \"abf39778-b981-4807-916d-f62ff0a03ac9\") " Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.297841 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67" (OuterVolumeSpecName: "kube-api-access-78p67") pod "abf39778-b981-4807-916d-f62ff0a03ac9" (UID: "abf39778-b981-4807-916d-f62ff0a03ac9"). InnerVolumeSpecName "kube-api-access-78p67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.394130 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78p67\" (UniqueName: \"kubernetes.io/projected/abf39778-b981-4807-916d-f62ff0a03ac9-kube-api-access-78p67\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876722 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556738-trdfn" event={"ID":"abf39778-b981-4807-916d-f62ff0a03ac9","Type":"ContainerDied","Data":"8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73"} Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876796 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8892a5742b94a7e9f8652d254b030f77a3f8d616c87aead90b427a6d5a291a73" Mar 13 12:18:04 crc kubenswrapper[4837]: I0313 12:18:04.876828 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556738-trdfn" Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.060469 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec46ef58-a8e9-4354-b9a1-568535879964" path="/var/lib/kubelet/pods/ec46ef58-a8e9-4354-b9a1-568535879964/volumes" Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.275589 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:18:05 crc kubenswrapper[4837]: I0313 12:18:05.297348 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556732-84qfh"] Mar 13 12:18:06 crc kubenswrapper[4837]: I0313 12:18:06.047739 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:06 crc kubenswrapper[4837]: E0313 12:18:06.048075 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:07 crc kubenswrapper[4837]: I0313 12:18:07.060791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564139cd-c95b-45c7-bf55-00c944313930" path="/var/lib/kubelet/pods/564139cd-c95b-45c7-bf55-00c944313930/volumes" Mar 13 12:18:21 crc kubenswrapper[4837]: I0313 12:18:21.051671 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:21 crc kubenswrapper[4837]: E0313 12:18:21.052394 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:22 crc kubenswrapper[4837]: I0313 12:18:22.038343 4837 generic.go:334] "Generic (PLEG): container finished" podID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerID="7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645" exitCode=0 Mar 13 12:18:22 crc kubenswrapper[4837]: I0313 12:18:22.038453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerDied","Data":"7e0a021bc48a043f4211761a5ed2921942e6c5d4b5e12cae51fb489e9f145645"} Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.468479 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553767 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.553823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.564880 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56" (OuterVolumeSpecName: "kube-api-access-qps56") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "kube-api-access-qps56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:23 crc kubenswrapper[4837]: E0313 12:18:23.581132 4837 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam podName:0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5 nodeName:}" failed. No retries permitted until 2026-03-13 12:18:24.081097516 +0000 UTC m=+1819.719364279 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5") : error deleting /var/lib/kubelet/pods/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/volume-subpaths: remove /var/lib/kubelet/pods/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/volume-subpaths: no such file or directory Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.583919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory" (OuterVolumeSpecName: "inventory") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.656075 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:23 crc kubenswrapper[4837]: I0313 12:18:23.656129 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qps56\" (UniqueName: \"kubernetes.io/projected/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-kube-api-access-qps56\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060559 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" event={"ID":"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5","Type":"ContainerDied","Data":"56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0"} Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060605 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56bfbf56a4b06e90c9acfeb1c1dfdd56d79a35d972d01a9b46ddb65137df22e0" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.060625 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143010 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:24 crc kubenswrapper[4837]: E0313 12:18:24.143773 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143792 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: E0313 12:18:24.143829 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.143836 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.144056 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" containerName="oc" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.144083 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.145073 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.153911 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.166604 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") pod \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\" (UID: \"0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5\") " Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.171452 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5" (UID: "0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269327 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269410 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.269489 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.371976 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.372059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc 
kubenswrapper[4837]: I0313 12:18:24.372325 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.376182 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.377243 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.391429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"ssh-known-hosts-edpm-deployment-vjnpx\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.474825 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:24 crc kubenswrapper[4837]: I0313 12:18:24.981432 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vjnpx"] Mar 13 12:18:25 crc kubenswrapper[4837]: I0313 12:18:25.072316 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerStarted","Data":"451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc"} Mar 13 12:18:26 crc kubenswrapper[4837]: I0313 12:18:26.080421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerStarted","Data":"f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759"} Mar 13 12:18:26 crc kubenswrapper[4837]: I0313 12:18:26.101202 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" podStartSLOduration=1.70186305 podStartE2EDuration="2.101184749s" podCreationTimestamp="2026-03-13 12:18:24 +0000 UTC" firstStartedPulling="2026-03-13 12:18:24.993625011 +0000 UTC m=+1820.631891774" lastFinishedPulling="2026-03-13 12:18:25.39294671 +0000 UTC m=+1821.031213473" observedRunningTime="2026-03-13 12:18:26.09454251 +0000 UTC m=+1821.732809273" watchObservedRunningTime="2026-03-13 12:18:26.101184749 +0000 UTC m=+1821.739451512" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.391881 4837 scope.go:117] "RemoveContainer" containerID="e317d41369cc2f3ddf2e1c831d3041b43d32e03d72c05e27b06993576c33a0e8" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.460917 4837 scope.go:117] "RemoveContainer" 
containerID="c2cc081c6cf65b0ab460d8cc6143c9f0d5447d7db94e85de44cfe2121792b6a0" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.481874 4837 scope.go:117] "RemoveContainer" containerID="1c35974102ee9d500e8bf603751d70cec13d07eed47dcd00ec2798fe9d358807" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.534298 4837 scope.go:117] "RemoveContainer" containerID="5e6f4da7142b59c465f13069e8abffd32ebc3f04eeb6b88f772977ed584113c2" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.565947 4837 scope.go:117] "RemoveContainer" containerID="5d76ffad79a0d1339467174946f42bf027114aea75c47bb037057ca882b93f88" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.610571 4837 scope.go:117] "RemoveContainer" containerID="76d8bcdb73b13d595e4c37de91e0da9193b0dfe32e04f54fbcbfc723d4f95d1f" Mar 13 12:18:29 crc kubenswrapper[4837]: I0313 12:18:29.653176 4837 scope.go:117] "RemoveContainer" containerID="a98015db97ff0f5b37e30b833d1fc53c9a24f182fbe7bafcf011e2544e8dd80d" Mar 13 12:18:31 crc kubenswrapper[4837]: I0313 12:18:31.062138 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:18:31 crc kubenswrapper[4837]: I0313 12:18:31.062431 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-f6gwd"] Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.048716 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:32 crc kubenswrapper[4837]: E0313 12:18:32.049050 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.131089 4837 generic.go:334] "Generic (PLEG): container finished" podID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerID="f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759" exitCode=0 Mar 13 12:18:32 crc kubenswrapper[4837]: I0313 12:18:32.131140 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerDied","Data":"f83f1fa87b51b5557d732f083ed0b520911c865e056a38c97d7c668251609759"} Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.062843 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6d5bbe-7e5b-4645-95c4-af868cba3244" path="/var/lib/kubelet/pods/5d6d5bbe-7e5b-4645-95c4-af868cba3244/volumes" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.611397 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.640956 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.641069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.641129 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") pod \"4ddcb794-ab03-4308-a93c-c5929ed96e01\" (UID: \"4ddcb794-ab03-4308-a93c-c5929ed96e01\") " Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.648852 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls" (OuterVolumeSpecName: "kube-api-access-bt4ls") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "kube-api-access-bt4ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.668981 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.672810 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4ddcb794-ab03-4308-a93c-c5929ed96e01" (UID: "4ddcb794-ab03-4308-a93c-c5929ed96e01"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743507 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt4ls\" (UniqueName: \"kubernetes.io/projected/4ddcb794-ab03-4308-a93c-c5929ed96e01-kube-api-access-bt4ls\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743543 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:33 crc kubenswrapper[4837]: I0313 12:18:33.743553 4837 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ddcb794-ab03-4308-a93c-c5929ed96e01-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.146967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" event={"ID":"4ddcb794-ab03-4308-a93c-c5929ed96e01","Type":"ContainerDied","Data":"451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc"} Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.147017 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451be746bf2063e51cbed79829c6492e0e00242a709c0fe8864eca7fe5d169bc" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.147409 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vjnpx" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264191 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"] Mar 13 12:18:34 crc kubenswrapper[4837]: E0313 12:18:34.264742 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264760 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.264981 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddcb794-ab03-4308-a93c-c5929ed96e01" containerName="ssh-known-hosts-edpm-deployment" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.265741 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269169 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269192 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269465 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.269912 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.278167 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"] Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.353949 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.354024 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.354095 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455614 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455773 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.455870 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.461181 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.461962 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.484795 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s6jdp\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.581729 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:34 crc kubenswrapper[4837]: I0313 12:18:34.880284 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp"] Mar 13 12:18:35 crc kubenswrapper[4837]: I0313 12:18:35.155812 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerStarted","Data":"2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb"} Mar 13 12:18:36 crc kubenswrapper[4837]: I0313 12:18:36.166001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerStarted","Data":"5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd"} Mar 13 12:18:36 crc kubenswrapper[4837]: I0313 12:18:36.189421 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" podStartSLOduration=1.739640563 podStartE2EDuration="2.189400488s" podCreationTimestamp="2026-03-13 12:18:34 +0000 UTC" firstStartedPulling="2026-03-13 12:18:34.897707132 +0000 UTC m=+1830.535973895" lastFinishedPulling="2026-03-13 12:18:35.347467057 +0000 UTC m=+1830.985733820" observedRunningTime="2026-03-13 12:18:36.182041837 +0000 UTC m=+1831.820308600" watchObservedRunningTime="2026-03-13 12:18:36.189400488 +0000 UTC m=+1831.827667251" Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.048958 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:43 crc kubenswrapper[4837]: E0313 12:18:43.049850 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.239292 4837 generic.go:334] "Generic (PLEG): container finished" podID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerID="5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd" exitCode=0 Mar 13 12:18:43 crc kubenswrapper[4837]: I0313 12:18:43.239337 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerDied","Data":"5b0e6698b907c693465a3b16020d70fbda7db26663c7b84f6365068fdb5d08bd"} Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.650747 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846444 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846563 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.846669 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") pod \"f12ac62a-2011-4e89-a16f-e136959f9d1a\" (UID: \"f12ac62a-2011-4e89-a16f-e136959f9d1a\") " Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.853074 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr" (OuterVolumeSpecName: "kube-api-access-jh6xr") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "kube-api-access-jh6xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.877922 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.881917 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory" (OuterVolumeSpecName: "inventory") pod "f12ac62a-2011-4e89-a16f-e136959f9d1a" (UID: "f12ac62a-2011-4e89-a16f-e136959f9d1a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948862 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh6xr\" (UniqueName: \"kubernetes.io/projected/f12ac62a-2011-4e89-a16f-e136959f9d1a-kube-api-access-jh6xr\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948907 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:44 crc kubenswrapper[4837]: I0313 12:18:44.948921 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12ac62a-2011-4e89-a16f-e136959f9d1a-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278269 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" event={"ID":"f12ac62a-2011-4e89-a16f-e136959f9d1a","Type":"ContainerDied","Data":"2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb"} Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278314 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea8a7545c2f490a975716ffc5914ba1d4d1313b87599c519dda0edd29cdc7cb" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.278370 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s6jdp" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.358494 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"] Mar 13 12:18:45 crc kubenswrapper[4837]: E0313 12:18:45.359281 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.359306 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.359577 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12ac62a-2011-4e89-a16f-e136959f9d1a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.360372 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.368902 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"] Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376112 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376153 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376288 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.376301 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457723 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457818 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.457908 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559413 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.559652 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.564274 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.565880 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.577688 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:45 crc kubenswrapper[4837]: I0313 12:18:45.701567 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:46 crc kubenswrapper[4837]: I0313 12:18:46.196279 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d"] Mar 13 12:18:46 crc kubenswrapper[4837]: I0313 12:18:46.285906 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerStarted","Data":"0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe"} Mar 13 12:18:47 crc kubenswrapper[4837]: I0313 12:18:47.298321 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerStarted","Data":"f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703"} Mar 13 12:18:47 crc kubenswrapper[4837]: I0313 12:18:47.322980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" podStartSLOduration=1.916345562 podStartE2EDuration="2.322960751s" podCreationTimestamp="2026-03-13 12:18:45 +0000 UTC" firstStartedPulling="2026-03-13 12:18:46.202222008 +0000 UTC m=+1841.840488771" lastFinishedPulling="2026-03-13 12:18:46.608837177 +0000 UTC m=+1842.247103960" observedRunningTime="2026-03-13 12:18:47.318526581 +0000 UTC m=+1842.956793344" watchObservedRunningTime="2026-03-13 12:18:47.322960751 +0000 UTC m=+1842.961227514" Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.054364 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.073683 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:18:50 crc 
kubenswrapper[4837]: I0313 12:18:50.084759 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xlps2"] Mar 13 12:18:50 crc kubenswrapper[4837]: I0313 12:18:50.092335 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8mzt4"] Mar 13 12:18:51 crc kubenswrapper[4837]: I0313 12:18:51.060690 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b82791-6ef3-4a93-9d5a-84065d62775d" path="/var/lib/kubelet/pods/02b82791-6ef3-4a93-9d5a-84065d62775d/volumes" Mar 13 12:18:51 crc kubenswrapper[4837]: I0313 12:18:51.061377 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53268342-9adb-48b3-ba5b-52634c2c68fe" path="/var/lib/kubelet/pods/53268342-9adb-48b3-ba5b-52634c2c68fe/volumes" Mar 13 12:18:55 crc kubenswrapper[4837]: I0313 12:18:55.362755 4837 generic.go:334] "Generic (PLEG): container finished" podID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerID="f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703" exitCode=0 Mar 13 12:18:55 crc kubenswrapper[4837]: I0313 12:18:55.362835 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerDied","Data":"f37ec99eae382be87b274f7fe9869f814c866a0ac73b90f403cedf878941c703"} Mar 13 12:18:56 crc kubenswrapper[4837]: I0313 12:18:56.895043 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.049415 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:18:57 crc kubenswrapper[4837]: E0313 12:18:57.053741 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073502 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073789 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.073823 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") pod \"3b96ea7e-2148-4659-9a26-3335c88888c1\" (UID: \"3b96ea7e-2148-4659-9a26-3335c88888c1\") " Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.079770 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm" (OuterVolumeSpecName: "kube-api-access-nv2sm") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "kube-api-access-nv2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.102005 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.102291 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory" (OuterVolumeSpecName: "inventory") pod "3b96ea7e-2148-4659-9a26-3335c88888c1" (UID: "3b96ea7e-2148-4659-9a26-3335c88888c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177384 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177428 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b96ea7e-2148-4659-9a26-3335c88888c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.177445 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv2sm\" (UniqueName: \"kubernetes.io/projected/3b96ea7e-2148-4659-9a26-3335c88888c1-kube-api-access-nv2sm\") on node \"crc\" DevicePath \"\"" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" event={"ID":"3b96ea7e-2148-4659-9a26-3335c88888c1","Type":"ContainerDied","Data":"0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe"} Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385957 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc2be67d1432fb24a9622d1d6fa190c835e77bf54844cd1ef72245a818d45fe" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.385974 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.472370 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"] Mar 13 12:18:57 crc kubenswrapper[4837]: E0313 12:18:57.473177 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.473300 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.473581 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b96ea7e-2148-4659-9a26-3335c88888c1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.474458 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.481614 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482017 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482311 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482274 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.482936 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483059 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483176 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.483229 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.487452 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"] Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.587810 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588152 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588207 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588448 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588621 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588669 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588704 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588822 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588900 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.588987 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.589014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691112 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691176 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691238 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691264 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691290 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691346 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691384 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691432 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691457 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc 
kubenswrapper[4837]: I0313 12:18:57.691484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691531 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691591 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.691701 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.696695 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.696911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698030 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698443 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.698882 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699039 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699105 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.699912 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700351 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700621 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.700741 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.702397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.703965 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.711502 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gj59c\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:57 crc kubenswrapper[4837]: I0313 12:18:57.800491 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:18:58 crc kubenswrapper[4837]: I0313 12:18:58.331104 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c"] Mar 13 12:18:58 crc kubenswrapper[4837]: I0313 12:18:58.393973 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerStarted","Data":"a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211"} Mar 13 12:18:59 crc kubenswrapper[4837]: I0313 12:18:59.403542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerStarted","Data":"317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d"} Mar 13 12:18:59 crc kubenswrapper[4837]: I0313 12:18:59.421112 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" podStartSLOduration=1.994401026 podStartE2EDuration="2.421091047s" podCreationTimestamp="2026-03-13 12:18:57 +0000 UTC" firstStartedPulling="2026-03-13 12:18:58.332667299 +0000 UTC m=+1853.970934052" lastFinishedPulling="2026-03-13 12:18:58.75935731 +0000 UTC m=+1854.397624073" observedRunningTime="2026-03-13 12:18:59.417573486 +0000 UTC m=+1855.055840249" watchObservedRunningTime="2026-03-13 12:18:59.421091047 +0000 UTC m=+1855.059357810" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.410180 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.413112 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.425049 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601176 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601309 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.601363 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.704402 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.704509 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.704553 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.705259 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.705675 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.726422 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"redhat-operators-b85kp\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:08 crc kubenswrapper[4837]: I0313 12:19:08.739707 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.194771 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:09 crc kubenswrapper[4837]: W0313 12:19:09.199706 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9900be86_1923_4036_bccc_7e9c0484fb4c.slice/crio-ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d WatchSource:0}: Error finding container ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d: Status 404 returned error can't find the container with id ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495565 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" exitCode=0 Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495611 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4"} Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.495653 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d"} Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.499479 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.813986 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.816607 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.825961 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827802 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827859 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.827999 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930148 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930292 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.930326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.931069 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.931139 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:09 crc kubenswrapper[4837]: I0313 12:19:09.954184 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"certified-operators-tsfbn\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.049129 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:19:10 crc kubenswrapper[4837]: E0313 12:19:10.049422 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.135581 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.516160 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} Mar 13 12:19:10 crc kubenswrapper[4837]: I0313 12:19:10.717849 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.530206 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" exitCode=0 Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.530279 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547398 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4" exitCode=0 Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547440 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4"} Mar 13 12:19:11 crc kubenswrapper[4837]: I0313 12:19:11.547472 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"9cdfda55cf58dcae44b171ff87d0d9876fe414823d1ec7d0b1b7ed1df6f59fe5"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.567904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerStarted","Data":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.571600 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c"} Mar 13 12:19:13 crc kubenswrapper[4837]: I0313 12:19:13.593384 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b85kp" podStartSLOduration=2.710451386 podStartE2EDuration="5.593362852s" podCreationTimestamp="2026-03-13 12:19:08 +0000 UTC" firstStartedPulling="2026-03-13 12:19:09.499194528 +0000 UTC m=+1865.137461291" lastFinishedPulling="2026-03-13 12:19:12.382105994 +0000 UTC m=+1868.020372757" observedRunningTime="2026-03-13 12:19:13.583958596 +0000 UTC m=+1869.222225369" watchObservedRunningTime="2026-03-13 12:19:13.593362852 +0000 UTC m=+1869.231629615" Mar 13 12:19:14 crc kubenswrapper[4837]: I0313 12:19:14.584472 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c" exitCode=0 Mar 13 12:19:14 crc kubenswrapper[4837]: I0313 12:19:14.584590 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c"} Mar 13 12:19:16 crc kubenswrapper[4837]: I0313 12:19:16.602281 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerStarted","Data":"bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9"} Mar 13 12:19:16 crc kubenswrapper[4837]: I0313 12:19:16.630474 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tsfbn" podStartSLOduration=3.180937052 podStartE2EDuration="7.630450514s" podCreationTimestamp="2026-03-13 12:19:09 +0000 UTC" firstStartedPulling="2026-03-13 12:19:11.55120443 +0000 UTC m=+1867.189471203" lastFinishedPulling="2026-03-13 12:19:16.000717902 +0000 UTC m=+1871.638984665" observedRunningTime="2026-03-13 12:19:16.62236954 +0000 UTC m=+1872.260636293" watchObservedRunningTime="2026-03-13 12:19:16.630450514 +0000 UTC m=+1872.268717277" Mar 13 12:19:18 crc kubenswrapper[4837]: I0313 12:19:18.740749 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:18 crc kubenswrapper[4837]: I0313 12:19:18.741744 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:19 crc kubenswrapper[4837]: I0313 12:19:19.805216 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b85kp" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" probeResult="failure" output=< Mar 13 12:19:19 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:19:19 crc kubenswrapper[4837]: > Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.136757 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.136819 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:20 crc kubenswrapper[4837]: I0313 12:19:20.182657 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:25 crc kubenswrapper[4837]: I0313 12:19:25.054091 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:19:25 crc kubenswrapper[4837]: E0313 12:19:25.055035 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:19:28 crc kubenswrapper[4837]: I0313 12:19:28.784625 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:28 crc kubenswrapper[4837]: I0313 12:19:28.834810 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.022963 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.763948 4837 scope.go:117] "RemoveContainer" containerID="deea73f54571ed1f4517906256e112c93e642ebacb77d1a62a53b5217eb1d25c" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.803967 4837 scope.go:117] "RemoveContainer" containerID="5337a2212bdc3b1dbf150fa95afc9aaae420bfce797da10558e36cb08bd46c77" Mar 13 12:19:29 crc kubenswrapper[4837]: I0313 12:19:29.844321 4837 scope.go:117] "RemoveContainer" containerID="5e2fe1dde876f5e43e3e8ce2528c539e8504cc8726824d4a38da88b3f10df140" Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.184734 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725162 4837 generic.go:334] "Generic (PLEG): container finished" podID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerID="317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d" exitCode=0 Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725355 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b85kp" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" containerID="cri-o://b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" gracePeriod=2 Mar 13 12:19:30 crc kubenswrapper[4837]: I0313 12:19:30.725596 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerDied","Data":"317902596cfacd73a102a6829fe35a61e885c73f5b273e8ac5d10209c855380d"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.427406 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.427971 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tsfbn" 
podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server" containerID="cri-o://bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9" gracePeriod=2 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.698747 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735078 4837 generic.go:334] "Generic (PLEG): container finished" podID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" exitCode=0 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735142 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735169 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b85kp" event={"ID":"9900be86-1923-4036-bccc-7e9c0484fb4c","Type":"ContainerDied","Data":"ca581bf39f1e9f8c7e41b890048bfe29dcb34e40e5c64cbcb1b880ad113b0f6d"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735185 4837 scope.go:117] "RemoveContainer" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.735201 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b85kp" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.737466 4837 generic.go:334] "Generic (PLEG): container finished" podID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerID="bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9" exitCode=0 Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.737619 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9"} Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.758087 4837 scope.go:117] "RemoveContainer" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.817193 4837 scope.go:117] "RemoveContainer" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.840331 4837 scope.go:117] "RemoveContainer" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842156 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": container with ID starting with b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd not found: ID does not exist" containerID="b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842210 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd"} err="failed to get container status 
\"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": rpc error: code = NotFound desc = could not find container \"b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd\": container with ID starting with b0190408b44511066b179adf5b02a13554289d3dacdbf4491cd7c6e138e935cd not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842239 4837 scope.go:117] "RemoveContainer" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842478 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": container with ID starting with 053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e not found: ID does not exist" containerID="053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842510 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e"} err="failed to get container status \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": rpc error: code = NotFound desc = could not find container \"053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e\": container with ID starting with 053200fa75a090695d23d906d7072bcc128ba23e140608f0676e9047dd0b4f6e not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.842527 4837 scope.go:117] "RemoveContainer" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: E0313 12:19:31.842981 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": container with ID starting with 6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4 not found: ID does not exist" containerID="6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.843004 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4"} err="failed to get container status \"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": rpc error: code = NotFound desc = could not find container \"6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4\": container with ID starting with 6fab948827daa6c53b19a10ed677270420a413c7c36a1256f2a3b3246a0b99e4 not found: ID does not exist" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.859827 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.859963 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc 
kubenswrapper[4837]: I0313 12:19:31.860023 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") pod \"9900be86-1923-4036-bccc-7e9c0484fb4c\" (UID: \"9900be86-1923-4036-bccc-7e9c0484fb4c\") " Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.860687 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities" (OuterVolumeSpecName: "utilities") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.866602 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j" (OuterVolumeSpecName: "kube-api-access-q229j") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "kube-api-access-q229j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.931870 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.963492 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q229j\" (UniqueName: \"kubernetes.io/projected/9900be86-1923-4036-bccc-7e9c0484fb4c-kube-api-access-q229j\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:31 crc kubenswrapper[4837]: I0313 12:19:31.963541 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.005948 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9900be86-1923-4036-bccc-7e9c0484fb4c" (UID: "9900be86-1923-4036-bccc-7e9c0484fb4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065158 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065274 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.065428 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") pod \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\" (UID: \"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.066000 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9900be86-1923-4036-bccc-7e9c0484fb4c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.066402 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities" (OuterVolumeSpecName: "utilities") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.069263 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk" (OuterVolumeSpecName: "kube-api-access-ml6fk") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "kube-api-access-ml6fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.128907 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" (UID: "d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.141237 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.167826 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168120 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168176 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.168194 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml6fk\" (UniqueName: \"kubernetes.io/projected/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a-kube-api-access-ml6fk\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.184785 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b85kp"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269378 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269420 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.269452 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270299 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270344 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270381 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: 
\"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270416 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.270688 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271028 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271112 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271572 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271667 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.271755 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.273428 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") pod \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\" (UID: \"6cc8d0dd-d1e6-4374-bb90-aaefc9197350\") " Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.275294 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.275911 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.276173 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.278966 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279000 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279029 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.279823 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.281903 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.292536 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx" (OuterVolumeSpecName: "kube-api-access-75mvx") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "kube-api-access-75mvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297314 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.297964 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298307 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298492 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.298692 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.374151 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.374752 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory" (OuterVolumeSpecName: "inventory") pod "6cc8d0dd-d1e6-4374-bb90-aaefc9197350" (UID: "6cc8d0dd-d1e6-4374-bb90-aaefc9197350"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380534 4837 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380578 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380593 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380602 4837 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380613 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380623 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380631 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380680 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380688 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380698 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75mvx\" (UniqueName: \"kubernetes.io/projected/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-kube-api-access-75mvx\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.380708 4837 reconciler_common.go:293] "Volume 
detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc8d0dd-d1e6-4374-bb90-aaefc9197350-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748400 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" event={"ID":"6cc8d0dd-d1e6-4374-bb90-aaefc9197350","Type":"ContainerDied","Data":"a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211"} Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748453 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a004fb6e650fe374173e7535e9f528dd1cc37af26ae43f015e8366167fa211" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.748543 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gj59c" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753170 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsfbn" event={"ID":"d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a","Type":"ContainerDied","Data":"9cdfda55cf58dcae44b171ff87d0d9876fe414823d1ec7d0b1b7ed1df6f59fe5"} Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753236 4837 scope.go:117] "RemoveContainer" containerID="bfa2c9a9825342e83fc8ddcd92d9af1ebbe4e83ce8898f8cdf46504fa5d9b0b9" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.753234 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsfbn" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.785543 4837 scope.go:117] "RemoveContainer" containerID="50cefa7392ae5d869e28ff046f43607d22feea06cc84d09928cf9eb7cc27bc7c" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.805713 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.807865 4837 scope.go:117] "RemoveContainer" containerID="adad9381bc6a39ddbdad6c4301cbb23bc8c90b91950618f3b2fe7fc956cf30c4" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.814846 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tsfbn"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888351 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"] Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888770 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888796 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888824 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-content" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888833 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-content" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888855 4837 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-utilities" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888863 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="extract-utilities" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888880 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-content" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888887 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-content" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888896 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888905 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888922 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888931 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: E0313 12:19:32.888942 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-utilities" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.888950 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="extract-utilities" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889196 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889221 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" containerName="registry-server" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889238 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc8d0dd-d1e6-4374-bb90-aaefc9197350" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.889990 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.892514 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.892920 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.893099 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.900530 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.900679 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.901343 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"] Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.991863 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992096 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992170 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992261 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:32 crc kubenswrapper[4837]: I0313 12:19:32.992366 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.059768 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9900be86-1923-4036-bccc-7e9c0484fb4c" path="/var/lib/kubelet/pods/9900be86-1923-4036-bccc-7e9c0484fb4c/volumes" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.060842 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a" path="/var/lib/kubelet/pods/d8e0d9ea-b925-488b-93dc-bfd9fb6d2c1a/volumes" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094440 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094557 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094590 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094617 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.094666 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.095884 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.100753 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.100939 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.102371 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.116302 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kbffp\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.230143 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:19:33 crc kubenswrapper[4837]: W0313 12:19:33.729287 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod092bd277_504a_450d_aca1_d8ecc18f0c9f.slice/crio-a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e WatchSource:0}: Error finding container a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e: Status 404 returned error can't find the container with id a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.731846 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp"] Mar 13 12:19:33 crc kubenswrapper[4837]: I0313 12:19:33.765101 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerStarted","Data":"a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e"} Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.042754 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.060664 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzmd5"] Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.775843 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerStarted","Data":"77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8"} Mar 13 12:19:34 crc kubenswrapper[4837]: I0313 12:19:34.798536 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" podStartSLOduration=2.383676351 podStartE2EDuration="2.798515959s" podCreationTimestamp="2026-03-13 12:19:32 +0000 UTC" firstStartedPulling="2026-03-13 12:19:33.731763962 +0000 UTC m=+1889.370030725" lastFinishedPulling="2026-03-13 12:19:34.14660357 +0000 UTC m=+1889.784870333" observedRunningTime="2026-03-13 
12:19:34.792391947 +0000 UTC m=+1890.430658730" watchObservedRunningTime="2026-03-13 12:19:34.798515959 +0000 UTC m=+1890.436782722" Mar 13 12:19:35 crc kubenswrapper[4837]: I0313 12:19:35.060886 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f45aae-caa3-4c50-9059-be42d328cba1" path="/var/lib/kubelet/pods/f0f45aae-caa3-4c50-9059-be42d328cba1/volumes" Mar 13 12:19:39 crc kubenswrapper[4837]: I0313 12:19:39.048178 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:19:39 crc kubenswrapper[4837]: I0313 12:19:39.819750 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"} Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.140814 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.142758 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.144904 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.145030 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.146747 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.156222 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.309577 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.412053 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.434549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"auto-csr-approver-29556740-snmw2\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.481964 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:00 crc kubenswrapper[4837]: I0313 12:20:00.923504 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:20:01 crc kubenswrapper[4837]: I0313 12:20:01.002987 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerStarted","Data":"2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826"} Mar 13 12:20:03 crc kubenswrapper[4837]: I0313 12:20:03.025356 4837 generic.go:334] "Generic (PLEG): container finished" podID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerID="5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66" exitCode=0 Mar 13 12:20:03 crc kubenswrapper[4837]: I0313 12:20:03.025496 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerDied","Data":"5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66"} Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.354534 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.489075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") pod \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\" (UID: \"e01710d7-a463-41fe-9d86-2410a8ccd8e8\") " Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.494152 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2" (OuterVolumeSpecName: "kube-api-access-j68n2") pod "e01710d7-a463-41fe-9d86-2410a8ccd8e8" (UID: "e01710d7-a463-41fe-9d86-2410a8ccd8e8"). InnerVolumeSpecName "kube-api-access-j68n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:20:04 crc kubenswrapper[4837]: I0313 12:20:04.590979 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j68n2\" (UniqueName: \"kubernetes.io/projected/e01710d7-a463-41fe-9d86-2410a8ccd8e8-kube-api-access-j68n2\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.047978 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556740-snmw2" Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.065198 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556740-snmw2" event={"ID":"e01710d7-a463-41fe-9d86-2410a8ccd8e8","Type":"ContainerDied","Data":"2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826"} Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.065257 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2872cecee5e2227b12f3a548445c20274e30fe510d83ffd4afcd24e93795e826" Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.416455 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:20:05 crc kubenswrapper[4837]: I0313 12:20:05.424864 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556734-g7zt7"] Mar 13 12:20:07 crc kubenswrapper[4837]: I0313 12:20:07.059487 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41b916d-46ab-43e8-b624-bb1fb6aaf2f8" path="/var/lib/kubelet/pods/b41b916d-46ab-43e8-b624-bb1fb6aaf2f8/volumes" Mar 13 12:20:12 crc kubenswrapper[4837]: E0313 12:20:12.185795 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:22 crc kubenswrapper[4837]: E0313 12:20:22.431918 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:28 crc kubenswrapper[4837]: I0313 12:20:28.249398 4837 generic.go:334] "Generic (PLEG): container finished" podID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerID="77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8" exitCode=0 Mar 13 12:20:28 crc kubenswrapper[4837]: I0313 12:20:28.249470 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerDied","Data":"77078cd73552b4fd4a97cf95b6976032937dfb766ff067aae032358f923a91d8"} Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.656169 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678202 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678269 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678318 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678355 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.678395 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") pod \"092bd277-504a-450d-aca1-d8ecc18f0c9f\" (UID: \"092bd277-504a-450d-aca1-d8ecc18f0c9f\") " Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.684812 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5" (OuterVolumeSpecName: "kube-api-access-v59h5") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "kube-api-access-v59h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.684828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.710090 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory" (OuterVolumeSpecName: "inventory") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.713828 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.715221 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "092bd277-504a-450d-aca1-d8ecc18f0c9f" (UID: "092bd277-504a-450d-aca1-d8ecc18f0c9f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780061 4837 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780105 4837 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780118 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780129 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/092bd277-504a-450d-aca1-d8ecc18f0c9f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.780142 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v59h5\" (UniqueName: \"kubernetes.io/projected/092bd277-504a-450d-aca1-d8ecc18f0c9f-kube-api-access-v59h5\") on node \"crc\" DevicePath \"\"" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.948611 4837 scope.go:117] "RemoveContainer" containerID="e540ca1787fcba1ed1f9804f4336a11c9388c115ed0bc76404d559071e68ab56" Mar 13 12:20:29 crc kubenswrapper[4837]: I0313 12:20:29.997853 4837 scope.go:117] "RemoveContainer" containerID="618f29cef46a018933eff3564372eb6b93270ae38a4b8bb52de53e9e241ebfba" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.269969 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" event={"ID":"092bd277-504a-450d-aca1-d8ecc18f0c9f","Type":"ContainerDied","Data":"a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e"} Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.270250 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99bf809e709f4093c6a3a9be8928febf788cb56e51d961891f138284d2fd35e" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.270037 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kbffp" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.352831 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:30 crc kubenswrapper[4837]: E0313 12:20:30.353289 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353309 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: E0313 12:20:30.353327 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353336 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353582 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" containerName="oc" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.353600 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="092bd277-504a-450d-aca1-d8ecc18f0c9f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.354536 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360906 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360944 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.360959 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.361108 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.361850 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.362430 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.369547 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391279 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391373 4837 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391452 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.391974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494051 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494202 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494316 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494408 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.494484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498333 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498551 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.498901 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.499676 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.500760 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.512957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:30 crc kubenswrapper[4837]: I0313 12:20:30.671543 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:20:31 crc kubenswrapper[4837]: I0313 12:20:31.173870 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4"] Mar 13 12:20:31 crc kubenswrapper[4837]: I0313 12:20:31.281166 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerStarted","Data":"904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938"} Mar 13 12:20:32 crc kubenswrapper[4837]: I0313 12:20:32.290041 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerStarted","Data":"ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c"} Mar 13 12:20:32 crc kubenswrapper[4837]: I0313 12:20:32.313842 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" podStartSLOduration=1.804728444 podStartE2EDuration="2.313825724s" podCreationTimestamp="2026-03-13 12:20:30 +0000 UTC" firstStartedPulling="2026-03-13 12:20:31.182202328 +0000 UTC m=+1946.820469091" lastFinishedPulling="2026-03-13 12:20:31.691299608 +0000 UTC m=+1947.329566371" observedRunningTime="2026-03-13 12:20:32.312933745 +0000 UTC m=+1947.951200508" watchObservedRunningTime="2026-03-13 12:20:32.313825724 +0000 UTC m=+1947.952092487" Mar 13 12:20:32 crc kubenswrapper[4837]: E0313 12:20:32.675921 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:42 crc kubenswrapper[4837]: E0313 12:20:42.914286 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:20:53 crc kubenswrapper[4837]: E0313 12:20:53.149899 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:21:03 crc kubenswrapper[4837]: E0313 12:21:03.378474 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01710d7_a463_41fe_9d86_2410a8ccd8e8.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:21:13 crc kubenswrapper[4837]: I0313 12:21:13.625022 4837 generic.go:334] "Generic (PLEG): container finished" podID="20f35066-9c10-4433-a655-f5cef18d4deb" containerID="ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c" exitCode=0 Mar 13 12:21:13 crc kubenswrapper[4837]: I0313 12:21:13.625102 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerDied","Data":"ff8212009c342279b5e1961bf82567e4bb8b1fc5a57b88231787fdcfc37b919c"} Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.059942 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.164958 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165037 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165082 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165113 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.165864 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" (UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.166036 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") pod \"20f35066-9c10-4433-a655-f5cef18d4deb\" 
(UID: \"20f35066-9c10-4433-a655-f5cef18d4deb\") " Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.171992 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht" (OuterVolumeSpecName: "kube-api-access-59kht") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "kube-api-access-59kht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.183119 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.195389 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.197462 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.202621 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.207326 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory" (OuterVolumeSpecName: "inventory") pod "20f35066-9c10-4433-a655-f5cef18d4deb" (UID: "20f35066-9c10-4433-a655-f5cef18d4deb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269002 4837 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269031 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269068 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kht\" (UniqueName: \"kubernetes.io/projected/20f35066-9c10-4433-a655-f5cef18d4deb-kube-api-access-59kht\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269078 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269088 4837 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.269100 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20f35066-9c10-4433-a655-f5cef18d4deb-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640682 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" event={"ID":"20f35066-9c10-4433-a655-f5cef18d4deb","Type":"ContainerDied","Data":"904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938"} Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640725 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904b0b4d824437fa7194e901c77da3b777325b0d631b66698c9a20c59c99d938" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.640734 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.766165 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:15 crc kubenswrapper[4837]: E0313 12:21:15.766744 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.766847 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.767075 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f35066-9c10-4433-a655-f5cef18d4deb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.770391 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.777736 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.778082 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.778526 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.781510 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.782719 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.788396 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895871 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895926 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.895974 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: 
\"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.896308 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.896389 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998748 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998808 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998959 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:15 crc kubenswrapper[4837]: I0313 12:21:15.998998 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.002995 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: 
\"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.004072 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.005037 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.007165 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.019444 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.096999 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:21:16 crc kubenswrapper[4837]: I0313 12:21:16.638566 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5"] Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.659399 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerStarted","Data":"f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38"} Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.660756 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerStarted","Data":"635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5"} Mar 13 12:21:17 crc kubenswrapper[4837]: I0313 12:21:17.677997 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" podStartSLOduration=1.9677041119999998 podStartE2EDuration="2.677975126s" podCreationTimestamp="2026-03-13 12:21:15 +0000 UTC" firstStartedPulling="2026-03-13 12:21:16.64434545 +0000 UTC m=+1992.282612213" lastFinishedPulling="2026-03-13 12:21:17.354616464 +0000 UTC m=+1992.992883227" observedRunningTime="2026-03-13 12:21:17.672218575 +0000 UTC m=+1993.310485338" watchObservedRunningTime="2026-03-13 12:21:17.677975126 +0000 UTC m=+1993.316241889" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.140313 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.142416 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.144789 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.145011 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.145672 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.149303 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.282232 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.384094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.412533 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"auto-csr-approver-29556742-5ggnq\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.500410 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:00 crc kubenswrapper[4837]: I0313 12:22:00.923990 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:22:01 crc kubenswrapper[4837]: I0313 12:22:01.038326 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerStarted","Data":"58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13"} Mar 13 12:22:03 crc kubenswrapper[4837]: I0313 12:22:03.065380 4837 generic.go:334] "Generic (PLEG): container finished" podID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerID="852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d" exitCode=0 Mar 13 12:22:03 crc kubenswrapper[4837]: I0313 12:22:03.066410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerDied","Data":"852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d"} Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.442848 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.487742 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") pod \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\" (UID: \"aed6dbbf-3a09-4b60-9757-7c74a07f9c63\") " Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.495872 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4" (OuterVolumeSpecName: "kube-api-access-qf6v4") pod "aed6dbbf-3a09-4b60-9757-7c74a07f9c63" (UID: "aed6dbbf-3a09-4b60-9757-7c74a07f9c63"). InnerVolumeSpecName "kube-api-access-qf6v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:22:04 crc kubenswrapper[4837]: I0313 12:22:04.589814 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf6v4\" (UniqueName: \"kubernetes.io/projected/aed6dbbf-3a09-4b60-9757-7c74a07f9c63-kube-api-access-qf6v4\") on node \"crc\" DevicePath \"\"" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084540 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" event={"ID":"aed6dbbf-3a09-4b60-9757-7c74a07f9c63","Type":"ContainerDied","Data":"58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13"} Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084583 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ec7ab55c0c919cae79f2a8321f6600f4d542955baab0750f8a171f55c53c13" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.084649 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556742-5ggnq" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.483474 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.483866 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.506150 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:22:05 crc kubenswrapper[4837]: I0313 12:22:05.514140 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556736-26kwx"] Mar 13 12:22:07 crc kubenswrapper[4837]: I0313 12:22:07.061517 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c1518c-d031-4597-ab77-8626e068bcda" path="/var/lib/kubelet/pods/a2c1518c-d031-4597-ab77-8626e068bcda/volumes" Mar 13 12:22:14 crc kubenswrapper[4837]: E0313 12:22:14.984337 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:25 crc kubenswrapper[4837]: E0313 12:22:25.225122 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:30 crc kubenswrapper[4837]: I0313 12:22:30.140106 4837 scope.go:117] "RemoveContainer" containerID="eab36df7c6a9acf9dc7560368f9674c4b5510068e382ff493b327a540b10eb38" Mar 13 12:22:35 crc kubenswrapper[4837]: E0313 12:22:35.453842 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:35 crc kubenswrapper[4837]: I0313 12:22:35.483527 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:22:35 crc kubenswrapper[4837]: I0313 12:22:35.483602 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:22:45 crc kubenswrapper[4837]: E0313 12:22:45.680125 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:22:55 crc kubenswrapper[4837]: E0313 12:22:55.911702 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed6dbbf_3a09_4b60_9757_7c74a07f9c63.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484091 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484559 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.484609 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.485329 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.485381 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" gracePeriod=600 Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.614861 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" exitCode=0 Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.614914 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037"} Mar 13 12:23:05 crc kubenswrapper[4837]: I0313 12:23:05.614954 4837 scope.go:117] "RemoveContainer" containerID="92ee41a64544d27e288dd6522ee4da27e8cb19ccf312984b122a6650cec27a8a" Mar 13 12:23:06 crc kubenswrapper[4837]: I0313 12:23:06.626410 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.149911 4837 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:00 crc kubenswrapper[4837]: E0313 12:24:00.151085 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.151108 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.151484 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" containerName="oc" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.152379 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.154653 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.158335 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.158669 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.162214 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.302407 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.405352 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.423875 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"auto-csr-approver-29556744-m59fh\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.475131 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:00 crc kubenswrapper[4837]: I0313 12:24:00.912124 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:24:01 crc kubenswrapper[4837]: I0313 12:24:01.156983 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerStarted","Data":"344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b"} Mar 13 12:24:03 crc kubenswrapper[4837]: I0313 12:24:03.176859 4837 generic.go:334] "Generic (PLEG): container finished" podID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerID="5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4" exitCode=0 Mar 13 12:24:03 crc kubenswrapper[4837]: I0313 12:24:03.176941 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerDied","Data":"5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4"} Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.504918 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.694482 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") pod \"1e934250-1bb4-41fe-b36e-2acf48194bcf\" (UID: \"1e934250-1bb4-41fe-b36e-2acf48194bcf\") " Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.702180 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7" (OuterVolumeSpecName: "kube-api-access-fjls7") pod "1e934250-1bb4-41fe-b36e-2acf48194bcf" (UID: "1e934250-1bb4-41fe-b36e-2acf48194bcf"). InnerVolumeSpecName "kube-api-access-fjls7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:24:04 crc kubenswrapper[4837]: I0313 12:24:04.797150 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjls7\" (UniqueName: \"kubernetes.io/projected/1e934250-1bb4-41fe-b36e-2acf48194bcf-kube-api-access-fjls7\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.197944 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556744-m59fh" event={"ID":"1e934250-1bb4-41fe-b36e-2acf48194bcf","Type":"ContainerDied","Data":"344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b"} Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.198297 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344af983d8397c15688256e604b383f3dd8ba0e599135f66bdd75e3c171eca4b" Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.197988 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556744-m59fh" Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.572133 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:24:05 crc kubenswrapper[4837]: I0313 12:24:05.581831 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556738-trdfn"] Mar 13 12:24:07 crc kubenswrapper[4837]: I0313 12:24:07.058889 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf39778-b981-4807-916d-f62ff0a03ac9" path="/var/lib/kubelet/pods/abf39778-b981-4807-916d-f62ff0a03ac9/volumes" Mar 13 12:24:30 crc kubenswrapper[4837]: I0313 12:24:30.265161 4837 scope.go:117] "RemoveContainer" containerID="7e866ef5a9a2608fd8aa30e6d573f07172996e7b068a978cf3d3449b179bd748" Mar 13 12:24:42 crc kubenswrapper[4837]: I0313 12:24:42.512850 4837 generic.go:334] "Generic (PLEG): container finished" podID="394104d4-0291-4071-a7da-d7b71e0f4083" containerID="f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38" exitCode=0 Mar 13 12:24:42 crc kubenswrapper[4837]: I0313 12:24:42.512956 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerDied","Data":"f30e37b9f7f0384121aa71f44589ddb9d3068a703ce922ccc922c61ff88b1f38"} Mar 13 12:24:43 crc kubenswrapper[4837]: I0313 12:24:43.903946 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073091 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073208 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073231 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073335 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.073369 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") pod \"394104d4-0291-4071-a7da-d7b71e0f4083\" (UID: \"394104d4-0291-4071-a7da-d7b71e0f4083\") " Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.079502 4837 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz" (OuterVolumeSpecName: "kube-api-access-zwdjz") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "kube-api-access-zwdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.081364 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.106290 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.109495 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory" (OuterVolumeSpecName: "inventory") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.110540 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "394104d4-0291-4071-a7da-d7b71e0f4083" (UID: "394104d4-0291-4071-a7da-d7b71e0f4083"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175631 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175687 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwdjz\" (UniqueName: \"kubernetes.io/projected/394104d4-0291-4071-a7da-d7b71e0f4083-kube-api-access-zwdjz\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175714 4837 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175736 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.175750 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/394104d4-0291-4071-a7da-d7b71e0f4083-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.541779 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" event={"ID":"394104d4-0291-4071-a7da-d7b71e0f4083","Type":"ContainerDied","Data":"635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5"} Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.541873 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635868923cf7f5008b52abe367a7a6d82aa47f6efef93a5cafc25c193c32e1e5" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.542155 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632110 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"] Mar 13 12:24:44 crc kubenswrapper[4837]: E0313 12:24:44.632769 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632843 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 12:24:44 crc kubenswrapper[4837]: E0313 12:24:44.632903 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.632980 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.633204 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" containerName="oc" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.633271 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="394104d4-0291-4071-a7da-d7b71e0f4083" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.634014 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636096 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636416 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.636674 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638126 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638281 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638449 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.638583 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.660165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"] Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787086 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787144 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787163 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787252 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787273 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787292 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787322 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787371 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787435 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787451 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.787468 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.888938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.888986 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889007 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889040 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889088 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889126 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889142 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889157 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889195 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889234 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.889250 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.894149 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.894549 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.895425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897293 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897626 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.897911 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.898131 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.902356 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.902990 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.914193 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.919255 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-4jdmk\" (UID: 
\"e6986f16-e143-49f4-81e5-58abba717876\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:44 crc kubenswrapper[4837]: I0313 12:24:44.961872 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.517263 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk"] Mar 13 12:24:45 crc kubenswrapper[4837]: W0313 12:24:45.524523 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6986f16_e143_49f4_81e5_58abba717876.slice/crio-7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643 WatchSource:0}: Error finding container 7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643: Status 404 returned error can't find the container with id 7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643 Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.528090 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:24:45 crc kubenswrapper[4837]: I0313 12:24:45.554939 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerStarted","Data":"7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643"} Mar 13 12:24:46 crc kubenswrapper[4837]: I0313 12:24:46.564001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerStarted","Data":"18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5"} Mar 13 12:24:46 crc kubenswrapper[4837]: I0313 12:24:46.586855 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" podStartSLOduration=2.113208818 podStartE2EDuration="2.586839393s" podCreationTimestamp="2026-03-13 12:24:44 +0000 UTC" firstStartedPulling="2026-03-13 12:24:45.52783777 +0000 UTC m=+2201.166104533" lastFinishedPulling="2026-03-13 12:24:46.001468335 +0000 UTC m=+2201.639735108" observedRunningTime="2026-03-13 12:24:46.580987969 +0000 UTC m=+2202.219254752" watchObservedRunningTime="2026-03-13 12:24:46.586839393 +0000 UTC m=+2202.225106156" Mar 13 12:25:05 crc kubenswrapper[4837]: I0313 12:25:05.484372 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:25:05 crc kubenswrapper[4837]: I0313 12:25:05.485111 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:25:35 crc kubenswrapper[4837]: I0313 12:25:35.483950 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:25:35 crc kubenswrapper[4837]: I0313 12:25:35.484978 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.978198 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.981197 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:54 crc kubenswrapper[4837]: I0313 12:25:54.993727 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068487 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068698 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.068729 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.170946 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.171436 4837 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.171611 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.195002 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"community-operators-j6t59\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.313989 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:25:55 crc kubenswrapper[4837]: I0313 12:25:55.806572 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.397713 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" exitCode=0 Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.398153 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d"} Mar 13 12:25:56 crc kubenswrapper[4837]: I0313 12:25:56.398182 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"6e5ab350326a007d05239aba067c1ae7270bb4feadf1120d3dff1a07c76500a1"} Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.367207 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.369819 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.382487 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.415593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"} Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.526930 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.526992 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.527029 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629292 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629559 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629811 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.629863 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.630552 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.660295 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"redhat-marketplace-n7dpp\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:57 crc kubenswrapper[4837]: I0313 12:25:57.691157 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.164746 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.425746 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" exitCode=0 Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.425798 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"} Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426884 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357" exitCode=0 Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357"} Mar 13 12:25:58 crc kubenswrapper[4837]: I0313 12:25:58.426939 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerStarted","Data":"bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.436295 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c" exitCode=0 Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.436374 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.444398 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerStarted","Data":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} Mar 13 12:25:59 crc kubenswrapper[4837]: I0313 12:25:59.475702 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6t59" 
podStartSLOduration=3.011868767 podStartE2EDuration="5.475682274s" podCreationTimestamp="2026-03-13 12:25:54 +0000 UTC" firstStartedPulling="2026-03-13 12:25:56.39946636 +0000 UTC m=+2272.037733123" lastFinishedPulling="2026-03-13 12:25:58.863279847 +0000 UTC m=+2274.501546630" observedRunningTime="2026-03-13 12:25:59.473758253 +0000 UTC m=+2275.112025026" watchObservedRunningTime="2026-03-13 12:25:59.475682274 +0000 UTC m=+2275.113949037" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.154753 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.156469 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.158479 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.158975 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.159067 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.174388 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.206996 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.309093 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.329320 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"auto-csr-approver-29556746-vwjkq\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.453986 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerStarted","Data":"a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e"} Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.479516 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7dpp" podStartSLOduration=2.097581473 podStartE2EDuration="3.479492118s" podCreationTimestamp="2026-03-13 12:25:57 +0000 UTC" firstStartedPulling="2026-03-13 12:25:58.431898867 +0000 UTC m=+2274.070165650" lastFinishedPulling="2026-03-13 12:25:59.813809532 +0000 UTC 
m=+2275.452076295" observedRunningTime="2026-03-13 12:26:00.469399402 +0000 UTC m=+2276.107666155" watchObservedRunningTime="2026-03-13 12:26:00.479492118 +0000 UTC m=+2276.117758881" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.480595 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:00 crc kubenswrapper[4837]: I0313 12:26:00.968474 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:26:00 crc kubenswrapper[4837]: W0313 12:26:00.974034 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9a9c7b_13fc_4655_91b2_a388c3870bf8.slice/crio-413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91 WatchSource:0}: Error finding container 413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91: Status 404 returned error can't find the container with id 413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91 Mar 13 12:26:01 crc kubenswrapper[4837]: I0313 12:26:01.467229 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerStarted","Data":"413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91"} Mar 13 12:26:02 crc kubenswrapper[4837]: I0313 12:26:02.489456 4837 generic.go:334] "Generic (PLEG): container finished" podID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerID="27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6" exitCode=0 Mar 13 12:26:02 crc kubenswrapper[4837]: I0313 12:26:02.489531 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerDied","Data":"27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6"} Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.811494 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.875096 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") pod \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\" (UID: \"eb9a9c7b-13fc-4655-91b2-a388c3870bf8\") " Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.883010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6" (OuterVolumeSpecName: "kube-api-access-lxsh6") pod "eb9a9c7b-13fc-4655-91b2-a388c3870bf8" (UID: "eb9a9c7b-13fc-4655-91b2-a388c3870bf8"). InnerVolumeSpecName "kube-api-access-lxsh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:03 crc kubenswrapper[4837]: I0313 12:26:03.978051 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsh6\" (UniqueName: \"kubernetes.io/projected/eb9a9c7b-13fc-4655-91b2-a388c3870bf8-kube-api-access-lxsh6\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514154 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" event={"ID":"eb9a9c7b-13fc-4655-91b2-a388c3870bf8","Type":"ContainerDied","Data":"413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91"} Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514209 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556746-vwjkq" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.514225 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413d37f2c29c6b102d98d402abf30d47d543609ff1745230a949d42eb5a26c91" Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.879602 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:26:04 crc kubenswrapper[4837]: I0313 12:26:04.887508 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556740-snmw2"] Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.058454 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01710d7-a463-41fe-9d86-2410a8ccd8e8" path="/var/lib/kubelet/pods/e01710d7-a463-41fe-9d86-2410a8ccd8e8/volumes" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.314619 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.314751 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.361566 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483804 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483877 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.483927 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.484821 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.484889 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" gracePeriod=600 Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.581129 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:05 crc kubenswrapper[4837]: E0313 12:26:05.626866 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:05 crc kubenswrapper[4837]: I0313 12:26:05.629013 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533799 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" exitCode=0 Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533849 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504"} Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.533907 4837 scope.go:117] "RemoveContainer" containerID="95ed8f8c7021ad56734ed8e8626e89cd8f2efcdf1bf9a33ce258f19439eeb037" Mar 13 12:26:06 crc kubenswrapper[4837]: I0313 12:26:06.534577 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:06 crc kubenswrapper[4837]: E0313 12:26:06.534828 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.545253 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6t59" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" containerID="cri-o://7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" gracePeriod=2 Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.694934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:07 crc kubenswrapper[4837]: I0313 12:26:07.694984 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:07 crc 
kubenswrapper[4837]: I0313 12:26:07.753171 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.012100 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.058814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.058898 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.059057 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") pod \"5739768e-3825-4869-9a20-d65269d6ff6e\" (UID: \"5739768e-3825-4869-9a20-d65269d6ff6e\") " Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.059978 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities" (OuterVolumeSpecName: "utilities") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.065421 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh" (OuterVolumeSpecName: "kube-api-access-69jzh") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "kube-api-access-69jzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.108613 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5739768e-3825-4869-9a20-d65269d6ff6e" (UID: "5739768e-3825-4869-9a20-d65269d6ff6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162126 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162169 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69jzh\" (UniqueName: \"kubernetes.io/projected/5739768e-3825-4869-9a20-d65269d6ff6e-kube-api-access-69jzh\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.162180 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5739768e-3825-4869-9a20-d65269d6ff6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556533 4837 generic.go:334] "Generic (PLEG): container finished" podID="5739768e-3825-4869-9a20-d65269d6ff6e" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" exitCode=0 Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556601 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556652 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6t59" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556697 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6t59" event={"ID":"5739768e-3825-4869-9a20-d65269d6ff6e","Type":"ContainerDied","Data":"6e5ab350326a007d05239aba067c1ae7270bb4feadf1120d3dff1a07c76500a1"} Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.556751 4837 scope.go:117] "RemoveContainer" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.579937 4837 scope.go:117] "RemoveContainer" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.603489 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.618739 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.625285 4837 scope.go:117] "RemoveContainer" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.627199 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6t59"] Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.656462 4837 scope.go:117] "RemoveContainer" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657028 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": container with ID starting with 
7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0 not found: ID does not exist" containerID="7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657078 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0"} err="failed to get container status \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": rpc error: code = NotFound desc = could not find container \"7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0\": container with ID starting with 7515c5269cc2629534353ef272cf372654fdc13cb5d83e02f1693f6899aa19f0 not found: ID does not exist" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657110 4837 scope.go:117] "RemoveContainer" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657481 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": container with ID starting with 18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2 not found: ID does not exist" containerID="18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657524 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2"} err="failed to get container status \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": rpc error: code = NotFound desc = could not find container \"18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2\": container with ID starting with 18665a547215957278c7198a584f67fb5944ff6d98d64e5f47da861c793724c2 not found: ID does not exist" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657551 4837 scope.go:117] "RemoveContainer" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: E0313 12:26:08.657835 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": container with ID starting with 8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d not found: ID does not exist" containerID="8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d" Mar 13 12:26:08 crc kubenswrapper[4837]: I0313 12:26:08.657865 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d"} err="failed to get container status \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": rpc error: code = NotFound desc = could not find container \"8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d\": container with ID starting with 8ec46fdf64b8b41f5ed86e596fecb59aef4dd2e3445048553d57b57f44cdf29d not found: ID does not exist" Mar 13 12:26:09 crc kubenswrapper[4837]: I0313 12:26:09.072910 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" path="/var/lib/kubelet/pods/5739768e-3825-4869-9a20-d65269d6ff6e/volumes" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.202049 
4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.202297 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7dpp" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" containerID="cri-o://a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" gracePeriod=2 Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607397 4837 generic.go:334] "Generic (PLEG): container finished" podID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerID="a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" exitCode=0 Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607421 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e"} Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607787 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7dpp" event={"ID":"fdac88ff-0567-4477-b88a-a90c2bc99da8","Type":"ContainerDied","Data":"bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990"} Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.607806 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1c61427223537fb34eec963d583b2360136dfdbb62761ba6bcff070b488990" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.683227 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746432 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746476 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.746601 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") pod \"fdac88ff-0567-4477-b88a-a90c2bc99da8\" (UID: \"fdac88ff-0567-4477-b88a-a90c2bc99da8\") " Mar 13 12:26:12 crc kubenswrapper[4837]: I0313 12:26:12.747627 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities" (OuterVolumeSpecName: "utilities") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:12.772915 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.615457 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7dpp" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.643585 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp" (OuterVolumeSpecName: "kube-api-access-7c9hp") pod "fdac88ff-0567-4477-b88a-a90c2bc99da8" (UID: "fdac88ff-0567-4477-b88a-a90c2bc99da8"). InnerVolumeSpecName "kube-api-access-7c9hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.648962 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.648993 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c9hp\" (UniqueName: \"kubernetes.io/projected/fdac88ff-0567-4477-b88a-a90c2bc99da8-kube-api-access-7c9hp\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.649006 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdac88ff-0567-4477-b88a-a90c2bc99da8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.937934 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:13 crc kubenswrapper[4837]: I0313 12:26:13.952404 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7dpp"] Mar 13 12:26:15 crc kubenswrapper[4837]: I0313 12:26:15.060814 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" path="/var/lib/kubelet/pods/fdac88ff-0567-4477-b88a-a90c2bc99da8/volumes" Mar 13 12:26:18 crc kubenswrapper[4837]: I0313 12:26:18.048898 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:18 crc kubenswrapper[4837]: E0313 12:26:18.049326 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:29 crc kubenswrapper[4837]: I0313 12:26:29.049234 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:29 crc kubenswrapper[4837]: E0313 12:26:29.050514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:30 crc kubenswrapper[4837]: I0313 12:26:30.355419 4837 scope.go:117] "RemoveContainer" containerID="5c53da3a56d1c8f877bdab4d65362dc1a8c31f8cd4991718456d0c1946898d66" Mar 13 12:26:40 crc kubenswrapper[4837]: I0313 12:26:40.048201 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:40 crc kubenswrapper[4837]: E0313 12:26:40.050572 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:51 crc kubenswrapper[4837]: I0313 12:26:51.952619 4837 generic.go:334] "Generic (PLEG): container finished" podID="e6986f16-e143-49f4-81e5-58abba717876" containerID="18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5" exitCode=0 Mar 13 12:26:51 crc kubenswrapper[4837]: I0313 12:26:51.952692 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerDied","Data":"18a83cd1cba4b0ec8cbb0763088a8fc20438f178de5bb307e3e42d268b1d9ec5"} Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.404190 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577348 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577439 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577482 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577514 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577615 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577668 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577743 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577796 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.577833 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") pod \"e6986f16-e143-49f4-81e5-58abba717876\" (UID: \"e6986f16-e143-49f4-81e5-58abba717876\") " Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.583587 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf" (OuterVolumeSpecName: "kube-api-access-fzqgf") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "kube-api-access-fzqgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.583738 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.607206 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.607300 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.608596 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.609197 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.613836 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.615620 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory" (OuterVolumeSpecName: "inventory") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.617918 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.628116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.642824 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e6986f16-e143-49f4-81e5-58abba717876" (UID: "e6986f16-e143-49f4-81e5-58abba717876"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678755 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678794 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678803 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678815 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678825 4837 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6986f16-e143-49f4-81e5-58abba717876-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678833 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzqgf\" (UniqueName: \"kubernetes.io/projected/e6986f16-e143-49f4-81e5-58abba717876-kube-api-access-fzqgf\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678842 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678852 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678861 4837 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678869 4837 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.678878 4837 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6986f16-e143-49f4-81e5-58abba717876-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.972684 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" event={"ID":"e6986f16-e143-49f4-81e5-58abba717876","Type":"ContainerDied","Data":"7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643"} Mar 13 12:26:53 crc kubenswrapper[4837]: 
I0313 12:26:53.972992 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc82d3fd71f5170b65dc07b84d25573c7febed498176a66fed2dfd3b4619643" Mar 13 12:26:53 crc kubenswrapper[4837]: I0313 12:26:53.972725 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-4jdmk" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.049557 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.050023 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.083558 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084135 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084157 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084170 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084177 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084199 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084205 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084215 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084220 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="extract-utilities" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084231 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084237 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084255 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084260 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084277 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084283 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="extract-content" Mar 13 12:26:54 crc kubenswrapper[4837]: E0313 12:26:54.084298 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084304 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084717 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5739768e-3825-4869-9a20-d65269d6ff6e" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084767 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" containerName="oc" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084783 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdac88ff-0567-4477-b88a-a90c2bc99da8" containerName="registry-server" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.084816 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6986f16-e143-49f4-81e5-58abba717876" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.085726 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.088169 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.088473 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.089358 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.089443 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dxdkz" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.090350 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.097704 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289689 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289739 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289843 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.289976 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290043 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 
12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290123 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.290179 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391802 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391930 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.391975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392001 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392045 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392146 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.392204 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.397301 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.397812 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.398092 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.398215 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.399203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.407158 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.411886 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:54 crc kubenswrapper[4837]: I0313 12:26:54.702412 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:26:55 crc kubenswrapper[4837]: I0313 12:26:55.277905 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x"] Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.002298 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerStarted","Data":"66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e"} Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.002869 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerStarted","Data":"b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f"} Mar 13 12:26:56 crc kubenswrapper[4837]: I0313 12:26:56.030521 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" podStartSLOduration=1.605321947 podStartE2EDuration="2.030499163s" podCreationTimestamp="2026-03-13 12:26:54 +0000 UTC" firstStartedPulling="2026-03-13 12:26:55.284040996 +0000 UTC m=+2330.922307759" lastFinishedPulling="2026-03-13 12:26:55.709218212 +0000 UTC m=+2331.347484975" observedRunningTime="2026-03-13 12:26:56.020888892 +0000 UTC m=+2331.659155675" watchObservedRunningTime="2026-03-13 12:26:56.030499163 +0000 UTC m=+2331.668765926" Mar 13 12:27:09 crc kubenswrapper[4837]: I0313 12:27:09.055206 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:09 crc kubenswrapper[4837]: E0313 12:27:09.056774 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:20 crc kubenswrapper[4837]: I0313 12:27:20.048819 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:20 crc kubenswrapper[4837]: E0313 12:27:20.050111 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:31 crc kubenswrapper[4837]: I0313 12:27:31.049334 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:31 crc kubenswrapper[4837]: E0313 12:27:31.050686 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:43 crc kubenswrapper[4837]: I0313 12:27:43.048720 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:43 crc kubenswrapper[4837]: E0313 12:27:43.049783 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:27:54 crc kubenswrapper[4837]: I0313 12:27:54.048928 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:27:54 crc kubenswrapper[4837]: E0313 12:27:54.049509 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.138931 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.140441 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.144868 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.144872 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.145032 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.161861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.264278 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.366755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.385268 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"auto-csr-approver-29556748-dgqh7\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.470427 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:00 crc kubenswrapper[4837]: I0313 12:28:00.955774 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:28:01 crc kubenswrapper[4837]: I0313 12:28:01.562691 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerStarted","Data":"16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e"} Mar 13 12:28:02 crc kubenswrapper[4837]: I0313 12:28:02.576856 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerStarted","Data":"1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23"} Mar 13 12:28:02 crc kubenswrapper[4837]: I0313 12:28:02.597772 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" podStartSLOduration=1.393065649 podStartE2EDuration="2.59774881s" podCreationTimestamp="2026-03-13 12:28:00 +0000 UTC" firstStartedPulling="2026-03-13 12:28:00.960944594 +0000 UTC m=+2396.599211357" lastFinishedPulling="2026-03-13 12:28:02.165627755 +0000 UTC m=+2397.803894518" observedRunningTime="2026-03-13 12:28:02.589165051 +0000 UTC m=+2398.227431824" watchObservedRunningTime="2026-03-13 12:28:02.59774881 +0000 UTC m=+2398.236015573" Mar 13 12:28:03 crc kubenswrapper[4837]: I0313 12:28:03.588869 4837 generic.go:334] "Generic (PLEG): container finished" podID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerID="1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23" exitCode=0 Mar 13 12:28:03 crc kubenswrapper[4837]: I0313 12:28:03.588909 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerDied","Data":"1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23"} Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.931328 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.956016 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") pod \"72d56baf-1d17-4cbb-a351-8f5bf373c768\" (UID: \"72d56baf-1d17-4cbb-a351-8f5bf373c768\") " Mar 13 12:28:04 crc kubenswrapper[4837]: I0313 12:28:04.967792 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt" (OuterVolumeSpecName: "kube-api-access-dlhlt") pod "72d56baf-1d17-4cbb-a351-8f5bf373c768" (UID: "72d56baf-1d17-4cbb-a351-8f5bf373c768"). InnerVolumeSpecName "kube-api-access-dlhlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.057837 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhlt\" (UniqueName: \"kubernetes.io/projected/72d56baf-1d17-4cbb-a351-8f5bf373c768-kube-api-access-dlhlt\") on node \"crc\" DevicePath \"\"" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610032 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" event={"ID":"72d56baf-1d17-4cbb-a351-8f5bf373c768","Type":"ContainerDied","Data":"16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e"} Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610085 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.610157 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556748-dgqh7" Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.660575 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:28:05 crc kubenswrapper[4837]: I0313 12:28:05.670747 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556742-5ggnq"] Mar 13 12:28:07 crc kubenswrapper[4837]: I0313 12:28:07.049034 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:07 crc kubenswrapper[4837]: E0313 12:28:07.049367 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:07 crc kubenswrapper[4837]: I0313 12:28:07.061466 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed6dbbf-3a09-4b60-9757-7c74a07f9c63" path="/var/lib/kubelet/pods/aed6dbbf-3a09-4b60-9757-7c74a07f9c63/volumes" Mar 13 12:28:13 crc kubenswrapper[4837]: E0313 12:28:13.217512 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:18 crc kubenswrapper[4837]: I0313 12:28:18.048611 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:18 crc kubenswrapper[4837]: E0313 12:28:18.049087 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:23 crc kubenswrapper[4837]: E0313 12:28:23.476257 4837 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:30 crc kubenswrapper[4837]: I0313 12:28:30.464013 4837 scope.go:117] "RemoveContainer" containerID="852beb2b4218c3ee146b9596afb327ce3ec642be20ae0116d12166c03475804d" Mar 13 12:28:33 crc kubenswrapper[4837]: I0313 12:28:33.048747 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:33 crc kubenswrapper[4837]: E0313 12:28:33.049254 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:33 crc kubenswrapper[4837]: E0313 12:28:33.770898 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:44 crc kubenswrapper[4837]: E0313 12:28:44.030134 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:45 crc kubenswrapper[4837]: I0313 12:28:45.057411 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:45 crc kubenswrapper[4837]: E0313 12:28:45.058077 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:28:54 crc kubenswrapper[4837]: E0313 12:28:54.260624 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:28:56 crc kubenswrapper[4837]: I0313 12:28:56.048457 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:28:56 crc kubenswrapper[4837]: E0313 12:28:56.049174 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:01 crc kubenswrapper[4837]: I0313 12:29:01.113592 4837 generic.go:334] "Generic (PLEG): container finished" podID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerID="66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e" exitCode=0 Mar 13 12:29:01 crc kubenswrapper[4837]: I0313 12:29:01.113720 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerDied","Data":"66c3c97b0179ed9e241ac7948bb5392842023bac50cbf42e15411d47abcc3b2e"} Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.506280 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684528 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684711 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684780 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684811 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684891 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.684920 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.685034 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") pod \"ac15848f-4f6f-4159-828f-d30a77f93a4b\" (UID: \"ac15848f-4f6f-4159-828f-d30a77f93a4b\") " Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.691519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.691919 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx" (OuterVolumeSpecName: "kube-api-access-w7pcx") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "kube-api-access-w7pcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.712805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.715968 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.716440 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory" (OuterVolumeSpecName: "inventory") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.717234 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.723815 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ac15848f-4f6f-4159-828f-d30a77f93a4b" (UID: "ac15848f-4f6f-4159-828f-d30a77f93a4b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.787877 4837 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788210 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788347 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788412 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788475 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7pcx\" (UniqueName: \"kubernetes.io/projected/ac15848f-4f6f-4159-828f-d30a77f93a4b-kube-api-access-w7pcx\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788628 4837 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:02 crc kubenswrapper[4837]: I0313 12:29:02.788725 4837 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac15848f-4f6f-4159-828f-d30a77f93a4b-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133613 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" event={"ID":"ac15848f-4f6f-4159-828f-d30a77f93a4b","Type":"ContainerDied","Data":"b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f"} Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133974 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bde5f05c7e3a4c4bb83a3a8eca2fac351228924917d2ac26964f39056c8c9f" Mar 13 12:29:03 crc kubenswrapper[4837]: I0313 12:29:03.133689 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x" Mar 13 12:29:04 crc kubenswrapper[4837]: E0313 12:29:04.486011 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d56baf_1d17_4cbb_a351_8f5bf373c768.slice/crio-16962aea59727abab979a02d6decfef8fe22e9a60374e99da3395d6326096e2e\": RecentStats: unable to find data in memory cache]" Mar 13 12:29:07 crc kubenswrapper[4837]: I0313 12:29:07.048682 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:07 crc kubenswrapper[4837]: E0313 12:29:07.049514 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:21 crc kubenswrapper[4837]: I0313 12:29:21.048109 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:21 crc kubenswrapper[4837]: E0313 12:29:21.048758 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:35 crc kubenswrapper[4837]: I0313 12:29:35.048449 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:35 crc kubenswrapper[4837]: E0313 12:29:35.049215 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:50 crc kubenswrapper[4837]: I0313 12:29:50.048241 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:29:50 crc kubenswrapper[4837]: E0313 12:29:50.049083 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.965921 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:29:59 crc kubenswrapper[4837]: E0313 12:29:59.967401 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 
crc kubenswrapper[4837]: I0313 12:29:59.967436 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 crc kubenswrapper[4837]: E0313 12:29:59.967460 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967467 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967810 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac15848f-4f6f-4159-828f-d30a77f93a4b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.967874 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" containerName="oc" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.968869 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.972935 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973333 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bvdx7" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.973163 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 12:29:59 crc kubenswrapper[4837]: I0313 12:29:59.976328 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035211 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035294 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035388 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035445 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035481 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035517 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035558 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.035665 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138042 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138110 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138153 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138203 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdksm\" (UniqueName: 
\"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138786 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.138935 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139097 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139615 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139708 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139752 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.139829 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.141203 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.142113 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") device mount path \"/mnt/openstack/pv09\"" 
pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.146508 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.151672 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152143 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152189 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.152299 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.157554 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.161203 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.161393 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.162047 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.164048 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.167627 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.169209 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.172507 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.174271 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.182163 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.192176 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.194957 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241502 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241543 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241793 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.241887 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.293616 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343487 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343564 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343605 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.343634 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.344686 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.350555 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.362425 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"collect-profiles-29556750-l24ln\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.363004 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"auto-csr-approver-29556750-xl9g6\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.572391 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.581220 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.739775 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 12:30:00 crc kubenswrapper[4837]: I0313 12:30:00.757346 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:30:01 crc kubenswrapper[4837]: W0313 12:30:01.025117 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode39a0509_ea55_4b46_a3dc_473bb655cad8.slice/crio-e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb WatchSource:0}: Error finding container e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb: Status 404 returned error can't find the container with id e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.026361 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:30:01 crc kubenswrapper[4837]: W0313 12:30:01.092051 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f7e035_5d7e_46a2_befe_a8e414e93d86.slice/crio-c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f WatchSource:0}: Error finding container c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f: Status 404 returned error can't find the container with id c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.094596 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln"] Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603290 4837 generic.go:334] "Generic (PLEG): container finished" podID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerID="9ebad436416607ee3afe8e629996f2bb4d9f2b83ecead032b698d990edb417f1" exitCode=0 Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603608 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerDied","Data":"9ebad436416607ee3afe8e629996f2bb4d9f2b83ecead032b698d990edb417f1"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.603656 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerStarted","Data":"c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.606099 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerStarted","Data":"e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611"} Mar 13 12:30:01 crc kubenswrapper[4837]: I0313 12:30:01.608016 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" 
event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerStarted","Data":"e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb"} Mar 13 12:30:02 crc kubenswrapper[4837]: I0313 12:30:02.956698 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.108289 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.108791 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.109018 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") pod \"76f7e035-5d7e-46a2-befe-a8e414e93d86\" (UID: \"76f7e035-5d7e-46a2-befe-a8e414e93d86\") " Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.110030 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume" (OuterVolumeSpecName: "config-volume") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.117009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7" (OuterVolumeSpecName: "kube-api-access-4qjw7") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). InnerVolumeSpecName "kube-api-access-4qjw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.117159 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76f7e035-5d7e-46a2-befe-a8e414e93d86" (UID: "76f7e035-5d7e-46a2-befe-a8e414e93d86"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211830 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76f7e035-5d7e-46a2-befe-a8e414e93d86-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211868 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76f7e035-5d7e-46a2-befe-a8e414e93d86-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.211878 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qjw7\" (UniqueName: \"kubernetes.io/projected/76f7e035-5d7e-46a2-befe-a8e414e93d86-kube-api-access-4qjw7\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635519 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" event={"ID":"76f7e035-5d7e-46a2-befe-a8e414e93d86","Type":"ContainerDied","Data":"c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f"} Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635570 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c89b83aad1ee935ee8dd45e177b8e6ce9366ca2dcd96dae48ec255b4f9217c3f" Mar 13 12:30:03 crc kubenswrapper[4837]: I0313 12:30:03.635583 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556750-l24ln" Mar 13 12:30:04 crc kubenswrapper[4837]: I0313 12:30:04.024882 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 12:30:04 crc kubenswrapper[4837]: I0313 12:30:04.032409 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556705-kllhr"] Mar 13 12:30:05 crc kubenswrapper[4837]: I0313 12:30:05.087107 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:05 crc kubenswrapper[4837]: E0313 12:30:05.087376 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:05 crc kubenswrapper[4837]: I0313 12:30:05.087710 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831db5b2-5229-4b52-8783-f99c640ba856" path="/var/lib/kubelet/pods/831db5b2-5229-4b52-8783-f99c640ba856/volumes" Mar 13 12:30:08 crc kubenswrapper[4837]: I0313 12:30:08.691400 4837 generic.go:334] "Generic (PLEG): container finished" podID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerID="b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e" exitCode=0 Mar 13 12:30:08 crc kubenswrapper[4837]: I0313 12:30:08.691654 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerDied","Data":"b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e"} Mar 13 12:30:10 crc 
kubenswrapper[4837]: I0313 12:30:10.072390 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.156854 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") pod \"e39a0509-ea55-4b46-a3dc-473bb655cad8\" (UID: \"e39a0509-ea55-4b46-a3dc-473bb655cad8\") " Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.164262 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492" (OuterVolumeSpecName: "kube-api-access-hw492") pod "e39a0509-ea55-4b46-a3dc-473bb655cad8" (UID: "e39a0509-ea55-4b46-a3dc-473bb655cad8"). InnerVolumeSpecName "kube-api-access-hw492". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.259653 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw492\" (UniqueName: \"kubernetes.io/projected/e39a0509-ea55-4b46-a3dc-473bb655cad8-kube-api-access-hw492\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" event={"ID":"e39a0509-ea55-4b46-a3dc-473bb655cad8","Type":"ContainerDied","Data":"e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb"} Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713472 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e02b5a85207cd574f3864bf7eedaf6214d2f2763660958e5efd2708fc7a806eb" Mar 13 12:30:10 crc kubenswrapper[4837]: I0313 12:30:10.713482 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556750-xl9g6" Mar 13 12:30:11 crc kubenswrapper[4837]: I0313 12:30:11.163155 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:30:11 crc kubenswrapper[4837]: I0313 12:30:11.182674 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556744-m59fh"] Mar 13 12:30:13 crc kubenswrapper[4837]: I0313 12:30:13.063201 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e934250-1bb4-41fe-b36e-2acf48194bcf" path="/var/lib/kubelet/pods/1e934250-1bb4-41fe-b36e-2acf48194bcf/volumes" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.189542 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:16 crc kubenswrapper[4837]: E0313 12:30:16.190294 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190310 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: E0313 12:30:16.190344 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190351 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190566 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" containerName="oc" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.190588 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f7e035-5d7e-46a2-befe-a8e414e93d86" containerName="collect-profiles" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.193451 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.204383 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278204 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278276 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.278347 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.382484 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.382872 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.383160 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.383750 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.384230 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.424108 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"certified-operators-28bfn\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:16 crc kubenswrapper[4837]: I0313 12:30:16.534660 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:20 crc kubenswrapper[4837]: I0313 12:30:20.048246 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:20 crc kubenswrapper[4837]: E0313 12:30:20.048801 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.544911 4837 scope.go:117] "RemoveContainer" containerID="965aad43c7ccd189d4d18246f935c745fc24b5e2cfb5b07896f9492e9109fb55" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.586238 4837 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.586418 4837 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:
kube-api-access-zdksm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(66bdda91-c5b6-4879-9adf-21846884c797): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.587666 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="66bdda91-c5b6-4879-9adf-21846884c797" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.635084 4837 scope.go:117] "RemoveContainer" containerID="5d6f6eb18121de7ac4f3538b881026fd87404ecae22fe7e8d631b874d26990e4" Mar 13 12:30:30 crc kubenswrapper[4837]: E0313 12:30:30.878982 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="66bdda91-c5b6-4879-9adf-21846884c797" Mar 13 12:30:30 crc kubenswrapper[4837]: I0313 12:30:30.937797 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:30 crc kubenswrapper[4837]: W0313 12:30:30.939465 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ee91bc_e007_4c68_99b5_34c7d0582011.slice/crio-5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6 WatchSource:0}: Error finding container 5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6: Status 404 returned error can't find the container with id 5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6 Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887537 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" exitCode=0 Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887752 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" 
event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212"} Mar 13 12:30:31 crc kubenswrapper[4837]: I0313 12:30:31.887864 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6"} Mar 13 12:30:33 crc kubenswrapper[4837]: I0313 12:30:33.906660 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} Mar 13 12:30:34 crc kubenswrapper[4837]: I0313 12:30:34.917303 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" exitCode=0 Mar 13 12:30:34 crc kubenswrapper[4837]: I0313 12:30:34.917351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.054356 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:35 crc kubenswrapper[4837]: E0313 12:30:35.055044 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.931118 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerStarted","Data":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} Mar 13 12:30:35 crc kubenswrapper[4837]: I0313 12:30:35.963980 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-28bfn" podStartSLOduration=16.251353086 podStartE2EDuration="19.963946567s" podCreationTimestamp="2026-03-13 12:30:16 +0000 UTC" firstStartedPulling="2026-03-13 12:30:31.889671157 +0000 UTC m=+2547.527937920" lastFinishedPulling="2026-03-13 12:30:35.602264638 +0000 UTC m=+2551.240531401" observedRunningTime="2026-03-13 12:30:35.961977574 +0000 UTC m=+2551.600244337" watchObservedRunningTime="2026-03-13 12:30:35.963946567 +0000 UTC m=+2551.602213330" Mar 13 12:30:36 crc kubenswrapper[4837]: I0313 12:30:36.534945 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:36 crc kubenswrapper[4837]: I0313 12:30:36.535003 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:37 crc kubenswrapper[4837]: I0313 12:30:37.583657 4837 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-28bfn" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" probeResult="failure" output=< Mar 13 12:30:37 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:30:37 crc kubenswrapper[4837]: > Mar 13 12:30:45 crc kubenswrapper[4837]: I0313 12:30:45.500849 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 12:30:46 crc kubenswrapper[4837]: I0313 12:30:46.581157 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:46 crc kubenswrapper[4837]: I0313 12:30:46.630508 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.043023 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerStarted","Data":"bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6"} Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.075380 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.334128991 podStartE2EDuration="49.075359151s" podCreationTimestamp="2026-03-13 12:29:58 +0000 UTC" firstStartedPulling="2026-03-13 12:30:00.756311218 +0000 UTC m=+2516.394577981" lastFinishedPulling="2026-03-13 12:30:45.497541378 +0000 UTC m=+2561.135808141" observedRunningTime="2026-03-13 12:30:47.059195012 +0000 UTC m=+2562.697461775" watchObservedRunningTime="2026-03-13 12:30:47.075359151 +0000 UTC m=+2562.713625914" Mar 13 12:30:47 crc kubenswrapper[4837]: I0313 12:30:47.394011 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.051013 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-28bfn" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" containerID="cri-o://3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" gracePeriod=2 Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.498595 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.640750 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.640951 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.641073 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") pod \"27ee91bc-e007-4c68-99b5-34c7d0582011\" (UID: \"27ee91bc-e007-4c68-99b5-34c7d0582011\") " Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.641980 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities" (OuterVolumeSpecName: "utilities") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.646909 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb" (OuterVolumeSpecName: "kube-api-access-8bbsb") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "kube-api-access-8bbsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.705260 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27ee91bc-e007-4c68-99b5-34c7d0582011" (UID: "27ee91bc-e007-4c68-99b5-34c7d0582011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743174 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bbsb\" (UniqueName: \"kubernetes.io/projected/27ee91bc-e007-4c68-99b5-34c7d0582011-kube-api-access-8bbsb\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743208 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:48 crc kubenswrapper[4837]: I0313 12:30:48.743218 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27ee91bc-e007-4c68-99b5-34c7d0582011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065432 4837 generic.go:334] "Generic (PLEG): container finished" podID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" exitCode=0 Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065480 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28bfn" event={"ID":"27ee91bc-e007-4c68-99b5-34c7d0582011","Type":"ContainerDied","Data":"5fc0ef45e263712833cb4f519304d7593121798056d551c869c283dc2e6747e6"} Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065527 4837 scope.go:117] "RemoveContainer" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.065700 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28bfn" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.109074 4837 scope.go:117] "RemoveContainer" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.117313 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.133627 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-28bfn"] Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.152516 4837 scope.go:117] "RemoveContainer" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.175320 4837 scope.go:117] "RemoveContainer" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: E0313 12:30:49.176155 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": container with ID starting with 3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd not found: ID does not exist" containerID="3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176202 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd"} err="failed to get container status \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": rpc error: code = NotFound desc = could not find container \"3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd\": container with ID starting with 3ae07172d6e621f542f887d75f91e3f8cd7080cb359dfe09039ef1cba01ff6fd not found: ID does not exist" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176228 4837 scope.go:117] "RemoveContainer" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: E0313 12:30:49.176579 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": container with ID starting with a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f not found: ID does not exist" containerID="a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176608 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f"} err="failed to get container status \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": rpc error: code = NotFound desc = could not find container \"a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f\": container with ID starting with a8175e7d0cec8aad075327704d3226eef42ece8f46fb741b77e0c3dfe1833f0f not found: ID does not exist" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.176626 4837 scope.go:117] "RemoveContainer" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc kubenswrapper[4837]: E0313 12:30:49.177142 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": container with ID starting with 93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212 not found: ID does not exist" containerID="93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212" Mar 13 12:30:49 crc kubenswrapper[4837]: I0313 12:30:49.177167 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212"} err="failed to get container status \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": rpc error: code = NotFound desc = could not find container \"93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212\": container with ID starting with 93168fca68a8d6317f60d8aeaccee8e87ed38a7a55bd16d7de59d1bf8ee00212 not found: ID does not exist" Mar 13 12:30:50 crc kubenswrapper[4837]: I0313 12:30:50.048571 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:30:50 crc kubenswrapper[4837]: E0313 12:30:50.049241 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:30:51 crc kubenswrapper[4837]: I0313 12:30:51.066388 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" path="/var/lib/kubelet/pods/27ee91bc-e007-4c68-99b5-34c7d0582011/volumes" Mar 13 12:31:02 crc kubenswrapper[4837]: I0313 12:31:02.048150 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:31:02 crc kubenswrapper[4837]: E0313 12:31:02.049215 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:31:16 crc kubenswrapper[4837]: I0313 12:31:16.048129 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:31:16 crc kubenswrapper[4837]: I0313 12:31:16.321965 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.633927 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635502 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635518 4837 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635533 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-content" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635540 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-content" Mar 13 12:31:19 crc kubenswrapper[4837]: E0313 12:31:19.635567 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-utilities" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635575 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="extract-utilities" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.635815 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ee91bc-e007-4c68-99b5-34c7d0582011" containerName="registry-server" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.637484 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.680687 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.742503 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.742702 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.742854 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844655 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844766 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.844900 4837 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.845473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.845525 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.870537 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"redhat-operators-548kp\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:19 crc kubenswrapper[4837]: I0313 12:31:19.958729 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:20 crc kubenswrapper[4837]: I0313 12:31:20.275717 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:20 crc kubenswrapper[4837]: W0313 12:31:20.291009 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f44c76_3281_4bf0_af2e_0bba3d0dd712.slice/crio-4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976 WatchSource:0}: Error finding container 4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976: Status 404 returned error can't find the container with id 4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976 Mar 13 12:31:20 crc kubenswrapper[4837]: I0313 12:31:20.403361 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976"} Mar 13 12:31:21 crc kubenswrapper[4837]: I0313 12:31:21.412028 4837 generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" exitCode=0 Mar 13 12:31:21 crc kubenswrapper[4837]: I0313 12:31:21.412077 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae"} Mar 13 12:31:22 crc kubenswrapper[4837]: I0313 12:31:22.421713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" 
event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} Mar 13 12:31:27 crc kubenswrapper[4837]: I0313 12:31:27.468085 4837 generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" exitCode=0 Mar 13 12:31:27 crc kubenswrapper[4837]: I0313 12:31:27.468200 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} Mar 13 12:31:28 crc kubenswrapper[4837]: I0313 12:31:28.479860 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerStarted","Data":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} Mar 13 12:31:28 crc kubenswrapper[4837]: I0313 12:31:28.506518 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-548kp" podStartSLOduration=3.07011843 podStartE2EDuration="9.506497167s" podCreationTimestamp="2026-03-13 12:31:19 +0000 UTC" firstStartedPulling="2026-03-13 12:31:21.414204686 +0000 UTC m=+2597.052471439" lastFinishedPulling="2026-03-13 12:31:27.850583403 +0000 UTC m=+2603.488850176" observedRunningTime="2026-03-13 12:31:28.503387689 +0000 UTC m=+2604.141654472" watchObservedRunningTime="2026-03-13 12:31:28.506497167 +0000 UTC m=+2604.144763950" Mar 13 12:31:29 crc kubenswrapper[4837]: I0313 12:31:29.959353 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:29 crc kubenswrapper[4837]: I0313 12:31:29.959656 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:31 crc kubenswrapper[4837]: I0313 12:31:31.009313 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" probeResult="failure" output=< Mar 13 12:31:31 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:31:31 crc kubenswrapper[4837]: > Mar 13 12:31:41 crc kubenswrapper[4837]: I0313 12:31:41.004033 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" probeResult="failure" output=< Mar 13 12:31:41 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:31:41 crc kubenswrapper[4837]: > Mar 13 12:31:50 crc kubenswrapper[4837]: I0313 12:31:50.005756 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:50 crc kubenswrapper[4837]: I0313 12:31:50.056402 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:50 crc kubenswrapper[4837]: I0313 12:31:50.831972 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:51 crc kubenswrapper[4837]: I0313 12:31:51.672486 4837 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-548kp" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" containerID="cri-o://15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" gracePeriod=2 Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.307614 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441351 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441565 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.441814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") pod \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\" (UID: \"44f44c76-3281-4bf0-af2e-0bba3d0dd712\") " Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.442541 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities" (OuterVolumeSpecName: "utilities") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.447601 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6" (OuterVolumeSpecName: "kube-api-access-qkpd6") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "kube-api-access-qkpd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.544325 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.544362 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkpd6\" (UniqueName: \"kubernetes.io/projected/44f44c76-3281-4bf0-af2e-0bba3d0dd712-kube-api-access-qkpd6\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.578663 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44f44c76-3281-4bf0-af2e-0bba3d0dd712" (UID: "44f44c76-3281-4bf0-af2e-0bba3d0dd712"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.646328 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44f44c76-3281-4bf0-af2e-0bba3d0dd712-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683409 4837 generic.go:334] "Generic (PLEG): container finished" podID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" exitCode=0 Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683453 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.684666 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-548kp" event={"ID":"44f44c76-3281-4bf0-af2e-0bba3d0dd712","Type":"ContainerDied","Data":"4f4bd9532e4e3bcde381fc829056a99c3b26dd353413052ef976af5de3db5976"} Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.683495 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-548kp" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.684740 4837 scope.go:117] "RemoveContainer" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.711549 4837 scope.go:117] "RemoveContainer" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.727855 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.741325 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-548kp"] Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.744887 4837 scope.go:117] "RemoveContainer" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.778974 4837 scope.go:117] "RemoveContainer" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.779465 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": container with ID starting with 15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4 not found: ID does not exist" containerID="15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.779493 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4"} err="failed to get container status \"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": rpc error: code = NotFound desc = could not find container \"15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4\": container with ID starting with 15e13b316223eb538da03157c2f12378c04650ef2033b6a2e3fdeb73d34837f4 not found: ID does not exist" Mar 13 12:31:52 crc 
kubenswrapper[4837]: I0313 12:31:52.779514 4837 scope.go:117] "RemoveContainer" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.780004 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": container with ID starting with c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b not found: ID does not exist" containerID="c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780029 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b"} err="failed to get container status \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": rpc error: code = NotFound desc = could not find container \"c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b\": container with ID starting with c025e0fcad011221373a532b0452eaa46808dfb3988ff1e006eac648a2c2453b not found: ID does not exist" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780043 4837 scope.go:117] "RemoveContainer" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: E0313 12:31:52.780294 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": container with ID starting with f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae not found: ID does not exist" containerID="f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae" Mar 13 12:31:52 crc kubenswrapper[4837]: I0313 12:31:52.780337 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae"} err="failed to get container status \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": rpc error: code = NotFound desc = could not find container \"f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae\": container with ID starting with f909d1a24957ee9bf34a08df5acbf22285940797d06b8e41029978786a4c1aae not found: ID does not exist" Mar 13 12:31:53 crc kubenswrapper[4837]: I0313 12:31:53.064149 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" path="/var/lib/kubelet/pods/44f44c76-3281-4bf0-af2e-0bba3d0dd712/volumes" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.142276 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143412 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-utilities" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143432 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-utilities" Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143453 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143459 4837 
state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: E0313 12:32:00.143486 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-content" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143491 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="extract-content" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.143720 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f44c76-3281-4bf0-af2e-0bba3d0dd712" containerName="registry-server" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.144427 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.147116 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.147693 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.148521 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.150108 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.300301 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.402800 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.429364 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"auto-csr-approver-29556752-5zvbj\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.468032 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:00 crc kubenswrapper[4837]: I0313 12:32:00.903356 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:32:01 crc kubenswrapper[4837]: I0313 12:32:01.765916 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerStarted","Data":"ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf"} Mar 13 12:32:02 crc kubenswrapper[4837]: I0313 12:32:02.777879 4837 generic.go:334] "Generic (PLEG): container finished" podID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerID="17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d" exitCode=0 Mar 13 12:32:02 crc kubenswrapper[4837]: I0313 12:32:02.777933 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerDied","Data":"17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d"} Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.152261 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.188836 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") pod \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\" (UID: \"6b8e58c8-d8ed-4773-99bb-6b480514d2b8\") " Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.193912 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn" (OuterVolumeSpecName: "kube-api-access-r7gsn") pod "6b8e58c8-d8ed-4773-99bb-6b480514d2b8" (UID: "6b8e58c8-d8ed-4773-99bb-6b480514d2b8"). InnerVolumeSpecName "kube-api-access-r7gsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.290892 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7gsn\" (UniqueName: \"kubernetes.io/projected/6b8e58c8-d8ed-4773-99bb-6b480514d2b8-kube-api-access-r7gsn\") on node \"crc\" DevicePath \"\"" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796386 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" event={"ID":"6b8e58c8-d8ed-4773-99bb-6b480514d2b8","Type":"ContainerDied","Data":"ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf"} Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796436 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed83f04bc89e6bcbccb374bf4403804d23818bfe3d206cdaf46734815716f1cf" Mar 13 12:32:04 crc kubenswrapper[4837]: I0313 12:32:04.796452 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556752-5zvbj" Mar 13 12:32:05 crc kubenswrapper[4837]: I0313 12:32:05.226953 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:32:05 crc kubenswrapper[4837]: I0313 12:32:05.236556 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556746-vwjkq"] Mar 13 12:32:07 crc kubenswrapper[4837]: I0313 12:32:07.059258 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9a9c7b-13fc-4655-91b2-a388c3870bf8" path="/var/lib/kubelet/pods/eb9a9c7b-13fc-4655-91b2-a388c3870bf8/volumes" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.775607 4837 scope.go:117] "RemoveContainer" containerID="f67572c8c6ce19fc30d5a363241b6294efe0fe117e547d75720e88fd9323c357" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.801551 4837 scope.go:117] "RemoveContainer" containerID="7b62114542f297a4d3e9e2cc215c273a290ed34518b0790734229b78c1fdfc3c" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.843322 4837 scope.go:117] "RemoveContainer" containerID="27d05aedac81655ab98a132d059aa69f642170fd7305465ba1bc55dadd819af6" Mar 13 12:32:30 crc kubenswrapper[4837]: I0313 12:32:30.903307 4837 scope.go:117] "RemoveContainer" containerID="a5eee508b483e64f809508b03aa1f0b24998bdde6d37da5807abe3cdc59f087e" Mar 13 12:33:35 crc kubenswrapper[4837]: I0313 12:33:35.484056 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:33:35 crc kubenswrapper[4837]: I0313 12:33:35.484631 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.146211 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:00 crc kubenswrapper[4837]: E0313 12:34:00.147985 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.148013 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.148255 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" containerName="oc" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.149232 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.151215 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.151908 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.152578 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.156713 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.200784 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.302987 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.324944 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"auto-csr-approver-29556754-kvwpx\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:00 crc kubenswrapper[4837]: I0313 12:34:00.468917 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:01 crc kubenswrapper[4837]: I0313 12:34:01.017020 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:34:01 crc kubenswrapper[4837]: I0313 12:34:01.862995 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerStarted","Data":"0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64"} Mar 13 12:34:02 crc kubenswrapper[4837]: I0313 12:34:02.871909 4837 generic.go:334] "Generic (PLEG): container finished" podID="41916e77-60c2-4138-b622-003f267ac74e" containerID="d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5" exitCode=0 Mar 13 12:34:02 crc kubenswrapper[4837]: I0313 12:34:02.871987 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerDied","Data":"d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5"} Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.288216 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.380399 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") pod \"41916e77-60c2-4138-b622-003f267ac74e\" (UID: \"41916e77-60c2-4138-b622-003f267ac74e\") " Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.387116 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf" (OuterVolumeSpecName: "kube-api-access-wdknf") pod "41916e77-60c2-4138-b622-003f267ac74e" (UID: "41916e77-60c2-4138-b622-003f267ac74e"). InnerVolumeSpecName "kube-api-access-wdknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.483088 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdknf\" (UniqueName: \"kubernetes.io/projected/41916e77-60c2-4138-b622-003f267ac74e-kube-api-access-wdknf\") on node \"crc\" DevicePath \"\"" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902543 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" event={"ID":"41916e77-60c2-4138-b622-003f267ac74e","Type":"ContainerDied","Data":"0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64"} Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902889 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0adfbe072e6cb409e30cd6bb9c285f58303b5968ad08015120f05eedfced64" Mar 13 12:34:04 crc kubenswrapper[4837]: I0313 12:34:04.902712 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556754-kvwpx" Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.352939 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.359892 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556748-dgqh7"] Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.484114 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:34:05 crc kubenswrapper[4837]: I0313 12:34:05.484183 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:07 crc kubenswrapper[4837]: I0313 12:34:07.059070 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d56baf-1d17-4cbb-a351-8f5bf373c768" path="/var/lib/kubelet/pods/72d56baf-1d17-4cbb-a351-8f5bf373c768/volumes" Mar 13 12:34:31 crc kubenswrapper[4837]: I0313 12:34:30.999796 4837 scope.go:117] "RemoveContainer" containerID="1f6ecae5057b8984cf0bda7716bd86797af96a5f4c3a84ef8f5f85cb3c1def23" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.483924 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.484580 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.484666 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.485422 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:34:35 crc kubenswrapper[4837]: I0313 12:34:35.485494 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" gracePeriod=600 Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.188975 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
containerID="2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" exitCode=0 Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.189065 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734"} Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.189537 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} Mar 13 12:34:36 crc kubenswrapper[4837]: I0313 12:34:36.189568 4837 scope.go:117] "RemoveContainer" containerID="e1ee224a94868c84b1a06b1622026924e013599fbae376745c85631013a75504" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.163214 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: E0313 12:36:00.164293 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.164310 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.164589 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="41916e77-60c2-4138-b622-003f267ac74e" containerName="oc" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.168485 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.170796 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.171115 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.172554 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.177524 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.269116 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.370443 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.398429 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"auto-csr-approver-29556756-xz7n5\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.488302 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.965309 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:36:00 crc kubenswrapper[4837]: I0313 12:36:00.973769 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:36:01 crc kubenswrapper[4837]: I0313 12:36:01.892490 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerStarted","Data":"756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d"} Mar 13 12:36:02 crc kubenswrapper[4837]: I0313 12:36:02.902578 4837 generic.go:334] "Generic (PLEG): container finished" podID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerID="8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480" exitCode=0 Mar 13 12:36:02 crc kubenswrapper[4837]: I0313 12:36:02.902698 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerDied","Data":"8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480"} Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.370332 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.462617 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") pod \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\" (UID: \"8833ed1c-80bb-4529-9f4a-6109d1a39f13\") " Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.471320 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs" (OuterVolumeSpecName: "kube-api-access-b5qgs") pod "8833ed1c-80bb-4529-9f4a-6109d1a39f13" (UID: "8833ed1c-80bb-4529-9f4a-6109d1a39f13"). InnerVolumeSpecName "kube-api-access-b5qgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.568179 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5qgs\" (UniqueName: \"kubernetes.io/projected/8833ed1c-80bb-4529-9f4a-6109d1a39f13-kube-api-access-b5qgs\") on node \"crc\" DevicePath \"\"" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920740 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" event={"ID":"8833ed1c-80bb-4529-9f4a-6109d1a39f13","Type":"ContainerDied","Data":"756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d"} Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920981 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756d5bc9a4b96d9378d5949b7aa9590b00ec5ed10f86898e6efa807e9d3a455d" Mar 13 12:36:04 crc kubenswrapper[4837]: I0313 12:36:04.920791 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556756-xz7n5" Mar 13 12:36:05 crc kubenswrapper[4837]: I0313 12:36:05.435599 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:36:05 crc kubenswrapper[4837]: I0313 12:36:05.443429 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556750-xl9g6"] Mar 13 12:36:07 crc kubenswrapper[4837]: I0313 12:36:07.100250 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39a0509-ea55-4b46-a3dc-473bb655cad8" path="/var/lib/kubelet/pods/e39a0509-ea55-4b46-a3dc-473bb655cad8/volumes" Mar 13 12:36:31 crc kubenswrapper[4837]: I0313 12:36:31.093881 4837 scope.go:117] "RemoveContainer" containerID="b4cb982598c9648b581b684b81629524a6916bc2574ae740b552fa7040fb8d2e" Mar 13 12:36:35 crc kubenswrapper[4837]: I0313 12:36:35.483974 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:36:35 crc kubenswrapper[4837]: I0313 12:36:35.484523 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:05 crc kubenswrapper[4837]: I0313 12:37:05.484301 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:37:05 crc kubenswrapper[4837]: I0313 12:37:05.484890 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.483701 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.484240 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.484291 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.485125 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:37:35 crc kubenswrapper[4837]: I0313 12:37:35.485186 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" gracePeriod=600 Mar 13 12:37:35 crc kubenswrapper[4837]: E0313 12:37:35.604809 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077003 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" exitCode=0 Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077086 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65"} Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.077541 4837 scope.go:117] "RemoveContainer" containerID="2b3696e623d9d39a462c11ad27c35b22549ca870996c4db75be20036c9201734" Mar 13 12:37:36 crc kubenswrapper[4837]: I0313 12:37:36.078221 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:37:36 crc kubenswrapper[4837]: E0313 12:37:36.078491 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:37:49 crc kubenswrapper[4837]: I0313 12:37:49.048462 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:37:49 crc kubenswrapper[4837]: E0313 12:37:49.049066 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.048470 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:00 crc kubenswrapper[4837]: E0313 12:38:00.049240 4837 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.149816 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:00 crc kubenswrapper[4837]: E0313 12:38:00.150304 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.150328 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.150521 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" containerName="oc" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.151152 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157193 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157460 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.157729 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.172318 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.249378 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.351436 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.370252 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"auto-csr-approver-29556758-srcvt\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.476445 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:00 crc kubenswrapper[4837]: I0313 12:38:00.894891 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:38:01 crc kubenswrapper[4837]: I0313 12:38:01.450370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerStarted","Data":"88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd"} Mar 13 12:38:02 crc kubenswrapper[4837]: I0313 12:38:02.460126 4837 generic.go:334] "Generic (PLEG): container finished" podID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerID="62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126" exitCode=0 Mar 13 12:38:02 crc kubenswrapper[4837]: I0313 12:38:02.460167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerDied","Data":"62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126"} Mar 13 12:38:03 crc kubenswrapper[4837]: I0313 12:38:03.886810 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.026438 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") pod \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\" (UID: \"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38\") " Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.032801 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz" (OuterVolumeSpecName: "kube-api-access-4sngz") pod "0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" (UID: "0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38"). InnerVolumeSpecName "kube-api-access-4sngz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.129357 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sngz\" (UniqueName: \"kubernetes.io/projected/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38-kube-api-access-4sngz\") on node \"crc\" DevicePath \"\"" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476791 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556758-srcvt" event={"ID":"0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38","Type":"ContainerDied","Data":"88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd"} Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476845 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ecdaf5d9fd6adb3ae64176b2aaffe17d690c37bef9bf13945ecd3020fa68dd" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.476849 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556758-srcvt" Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.954048 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:38:04 crc kubenswrapper[4837]: I0313 12:38:04.962440 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556752-5zvbj"] Mar 13 12:38:05 crc kubenswrapper[4837]: I0313 12:38:05.060368 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8e58c8-d8ed-4773-99bb-6b480514d2b8" path="/var/lib/kubelet/pods/6b8e58c8-d8ed-4773-99bb-6b480514d2b8/volumes" Mar 13 12:38:11 crc kubenswrapper[4837]: I0313 12:38:11.048622 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:11 crc kubenswrapper[4837]: E0313 12:38:11.049458 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:22 crc kubenswrapper[4837]: I0313 12:38:22.048538 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:22 crc kubenswrapper[4837]: E0313 12:38:22.049381 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:31 crc kubenswrapper[4837]: I0313 12:38:31.195892 4837 scope.go:117] "RemoveContainer" containerID="17c6f8f105ae4921e05f2c394ef1bdbce7049ed97fc5dda971790fd2b3b77a0d" Mar 13 12:38:35 crc kubenswrapper[4837]: I0313 12:38:35.055997 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:35 crc kubenswrapper[4837]: E0313 12:38:35.056829 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:38:46 crc kubenswrapper[4837]: I0313 12:38:46.048447 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:46 crc kubenswrapper[4837]: E0313 12:38:46.049471 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 
12:38:58 crc kubenswrapper[4837]: I0313 12:38:58.048138 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:38:58 crc kubenswrapper[4837]: E0313 12:38:58.048753 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:12 crc kubenswrapper[4837]: I0313 12:39:12.232311 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:12 crc kubenswrapper[4837]: E0313 12:39:12.233234 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:24 crc kubenswrapper[4837]: I0313 12:39:24.048868 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:24 crc kubenswrapper[4837]: E0313 12:39:24.049796 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:35 crc kubenswrapper[4837]: I0313 12:39:35.054149 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:35 crc kubenswrapper[4837]: E0313 12:39:35.054977 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:47 crc kubenswrapper[4837]: I0313 12:39:47.047924 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:47 crc kubenswrapper[4837]: E0313 12:39:47.048837 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:39:59 crc kubenswrapper[4837]: I0313 12:39:59.048897 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:39:59 crc 
kubenswrapper[4837]: E0313 12:39:59.049691 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.142913 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:00 crc kubenswrapper[4837]: E0313 12:40:00.143633 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.143668 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.143838 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" containerName="oc" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.144508 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147265 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147429 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.147743 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.151035 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.211266 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.312975 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.331805 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"auto-csr-approver-29556760-zcnxn\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.497571 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:00 crc kubenswrapper[4837]: I0313 12:40:00.922046 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:40:01 crc kubenswrapper[4837]: I0313 12:40:01.548866 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerStarted","Data":"35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2"} Mar 13 12:40:02 crc kubenswrapper[4837]: I0313 12:40:02.561413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerStarted","Data":"da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b"} Mar 13 12:40:02 crc kubenswrapper[4837]: I0313 12:40:02.578422 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" podStartSLOduration=1.279592313 podStartE2EDuration="2.578368828s" podCreationTimestamp="2026-03-13 12:40:00 +0000 UTC" firstStartedPulling="2026-03-13 12:40:00.927726458 +0000 UTC m=+3116.565993221" lastFinishedPulling="2026-03-13 12:40:02.226502973 +0000 UTC m=+3117.864769736" observedRunningTime="2026-03-13 12:40:02.574083383 +0000 UTC m=+3118.212350176" watchObservedRunningTime="2026-03-13 12:40:02.578368828 +0000 UTC m=+3118.216635601" Mar 13 12:40:03 crc kubenswrapper[4837]: I0313 12:40:03.573827 4837 generic.go:334] "Generic (PLEG): container finished" podID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerID="da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b" exitCode=0 Mar 13 12:40:03 crc kubenswrapper[4837]: I0313 12:40:03.573872 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerDied","Data":"da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b"} Mar 13 12:40:04 crc kubenswrapper[4837]: I0313 12:40:04.962723 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.121112 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") pod \"a273cb74-6dcc-4e87-8f25-db5c77132250\" (UID: \"a273cb74-6dcc-4e87-8f25-db5c77132250\") " Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.127495 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz" (OuterVolumeSpecName: "kube-api-access-r6lfz") pod "a273cb74-6dcc-4e87-8f25-db5c77132250" (UID: "a273cb74-6dcc-4e87-8f25-db5c77132250"). InnerVolumeSpecName "kube-api-access-r6lfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.223561 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6lfz\" (UniqueName: \"kubernetes.io/projected/a273cb74-6dcc-4e87-8f25-db5c77132250-kube-api-access-r6lfz\") on node \"crc\" DevicePath \"\"" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.590883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" event={"ID":"a273cb74-6dcc-4e87-8f25-db5c77132250","Type":"ContainerDied","Data":"35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2"} Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.591180 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a75c11f21fd3aad88d6e5d5ecb767c99ae66bc7ca8f14c0484d9bd5481efb2" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.590956 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556760-zcnxn" Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.645931 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:40:05 crc kubenswrapper[4837]: I0313 12:40:05.653766 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556754-kvwpx"] Mar 13 12:40:07 crc kubenswrapper[4837]: I0313 12:40:07.058925 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41916e77-60c2-4138-b622-003f267ac74e" path="/var/lib/kubelet/pods/41916e77-60c2-4138-b622-003f267ac74e/volumes" Mar 13 12:40:12 crc kubenswrapper[4837]: I0313 12:40:12.048318 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:12 crc kubenswrapper[4837]: E0313 12:40:12.049287 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:23 crc kubenswrapper[4837]: I0313 12:40:23.049228 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:23 crc kubenswrapper[4837]: E0313 12:40:23.050816 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:31 crc kubenswrapper[4837]: I0313 12:40:31.282994 4837 scope.go:117] "RemoveContainer" containerID="d7ac07201d91371d67830ee944bf685a8cc89e1603bec505a0f5a81676127cf5" Mar 13 12:40:35 crc kubenswrapper[4837]: I0313 12:40:35.063969 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:35 crc kubenswrapper[4837]: E0313 12:40:35.064718 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:47 crc kubenswrapper[4837]: I0313 12:40:47.050045 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:47 crc kubenswrapper[4837]: E0313 12:40:47.055310 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:40:58 crc kubenswrapper[4837]: I0313 12:40:58.048263 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:40:58 crc kubenswrapper[4837]: E0313 12:40:58.049048 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.143085 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:06 crc kubenswrapper[4837]: E0313 12:41:06.144289 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.144309 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.144592 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" containerName="oc" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.146472 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.155113 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.274944 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.275073 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.275118 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.377938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378073 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378120 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378732 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.378746 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.396972 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"certified-operators-bdzjd\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.464961 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:06 crc kubenswrapper[4837]: I0313 12:41:06.971050 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507450 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" exitCode=0 Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507530 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958"} Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.507771 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"70fa120127e07316b3b39c8800b91193ea0056418fb3accc2bf31bf0968af42a"} Mar 13 12:41:07 crc kubenswrapper[4837]: I0313 12:41:07.509598 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:41:09 crc kubenswrapper[4837]: I0313 12:41:09.528542 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} Mar 13 12:41:10 crc kubenswrapper[4837]: I0313 12:41:10.048802 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:10 crc kubenswrapper[4837]: E0313 12:41:10.049158 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:11 crc kubenswrapper[4837]: I0313 12:41:11.550167 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" exitCode=0 Mar 13 12:41:11 crc kubenswrapper[4837]: I0313 12:41:11.550266 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} Mar 13 12:41:13 crc kubenswrapper[4837]: I0313 12:41:13.581520 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" 
event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerStarted","Data":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} Mar 13 12:41:13 crc kubenswrapper[4837]: I0313 12:41:13.606436 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bdzjd" podStartSLOduration=2.53221213 podStartE2EDuration="7.606412647s" podCreationTimestamp="2026-03-13 12:41:06 +0000 UTC" firstStartedPulling="2026-03-13 12:41:07.509331939 +0000 UTC m=+3183.147598712" lastFinishedPulling="2026-03-13 12:41:12.583532476 +0000 UTC m=+3188.221799229" observedRunningTime="2026-03-13 12:41:13.605321993 +0000 UTC m=+3189.243588776" watchObservedRunningTime="2026-03-13 12:41:13.606412647 +0000 UTC m=+3189.244679420" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.465103 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.465409 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:16 crc kubenswrapper[4837]: I0313 12:41:16.510934 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:21 crc kubenswrapper[4837]: I0313 12:41:21.671742 4837 generic.go:334] "Generic (PLEG): container finished" podID="66bdda91-c5b6-4879-9adf-21846884c797" containerID="bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6" exitCode=0 Mar 13 12:41:21 crc kubenswrapper[4837]: I0313 12:41:21.671875 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerDied","Data":"bda10fa8fd12669f2f471650132835bc9a8231ba850dd11df31ebbad360b9cf6"} Mar 13 12:41:22 crc kubenswrapper[4837]: I0313 12:41:22.048875 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:22 crc kubenswrapper[4837]: E0313 12:41:22.049188 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.108483 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259245 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259314 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259337 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259364 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259512 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259549 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259672 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259699 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.259728 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") pod \"66bdda91-c5b6-4879-9adf-21846884c797\" (UID: \"66bdda91-c5b6-4879-9adf-21846884c797\") " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.260886 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data" (OuterVolumeSpecName: "config-data") pod 
"66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.261361 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266653 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266805 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm" (OuterVolumeSpecName: "kube-api-access-zdksm") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "kube-api-access-zdksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.266904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.305195 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.306970 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.310920 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.322076 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66bdda91-c5b6-4879-9adf-21846884c797" (UID: "66bdda91-c5b6-4879-9adf-21846884c797"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362450 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362482 4837 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362492 4837 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66bdda91-c5b6-4879-9adf-21846884c797-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362502 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdksm\" (UniqueName: \"kubernetes.io/projected/66bdda91-c5b6-4879-9adf-21846884c797-kube-api-access-zdksm\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362511 4837 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362519 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362528 4837 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362536 4837 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66bdda91-c5b6-4879-9adf-21846884c797-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.362567 4837 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.381959 4837 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.464357 4837 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691431 4837 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66bdda91-c5b6-4879-9adf-21846884c797","Type":"ContainerDied","Data":"e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611"} Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691489 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0109150fdc9bce6fc2a2f4d23a6692ef997ab608b50f1cec0fb2562f9d86611" Mar 13 12:41:23 crc kubenswrapper[4837]: I0313 12:41:23.691531 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.515274 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.580966 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:26 crc kubenswrapper[4837]: I0313 12:41:26.723953 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bdzjd" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" containerID="cri-o://18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" gracePeriod=2 Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.150403 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251330 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") pod \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251435 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") pod \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.251467 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") pod \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\" (UID: \"6859fd59-d276-46f7-85ce-3e4a1d934bc0\") " Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.252667 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities" (OuterVolumeSpecName: "utilities") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.261122 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w" (OuterVolumeSpecName: "kube-api-access-rkv2w") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "kube-api-access-rkv2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.328959 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6859fd59-d276-46f7-85ce-3e4a1d934bc0" (UID: "6859fd59-d276-46f7-85ce-3e4a1d934bc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353158 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkv2w\" (UniqueName: \"kubernetes.io/projected/6859fd59-d276-46f7-85ce-3e4a1d934bc0-kube-api-access-rkv2w\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353197 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.353210 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6859fd59-d276-46f7-85ce-3e4a1d934bc0-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.622501 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623074 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-content" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623095 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-content" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623122 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623131 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623150 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623159 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.623179 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-utilities" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623187 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="extract-utilities" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623429 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerName="registry-server" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.623468 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bdda91-c5b6-4879-9adf-21846884c797" containerName="tempest-tests-tempest-tests-runner" Mar 
13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.624252 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.626731 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bvdx7" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.630392 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.657241 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.657548 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737314 4837 generic.go:334] "Generic (PLEG): container finished" podID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" exitCode=0 Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737351 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737406 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdzjd" event={"ID":"6859fd59-d276-46f7-85ce-3e4a1d934bc0","Type":"ContainerDied","Data":"70fa120127e07316b3b39c8800b91193ea0056418fb3accc2bf31bf0968af42a"} Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737418 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdzjd" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.737435 4837 scope.go:117] "RemoveContainer" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.758932 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.759021 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.760452 4837 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.776211 4837 scope.go:117] "RemoveContainer" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.790522 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xwf\" (UniqueName: \"kubernetes.io/projected/0244acef-b630-4b97-9bb5-9f99de391613-kube-api-access-g4xwf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.797448 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.808511 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0244acef-b630-4b97-9bb5-9f99de391613\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.809687 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bdzjd"] Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.847956 4837 scope.go:117] "RemoveContainer" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.917320 4837 scope.go:117] "RemoveContainer" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.921387 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": container with ID starting with 18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137 not found: ID does not exist" containerID="18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.921457 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137"} err="failed to get container status \"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": rpc error: code = NotFound desc = could not find container \"18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137\": container with ID starting with 18ba41893d2fe107d0c2a7ad10a77546c6c462cb3679fbbb958f219373bc9137 not found: ID does not exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.921483 4837 scope.go:117] "RemoveContainer" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.922395 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": container with ID starting with 73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b not found: ID does not exist" containerID="73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922420 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b"} err="failed to get container status \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": rpc error: code = NotFound desc = could not find container \"73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b\": container with ID starting with 73dc7e0603990c2da015458d2bed2e5aee4a0bf61d78fa122e11168fdabe881b not found: ID does not exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922434 4837 scope.go:117] "RemoveContainer" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: E0313 12:41:27.922731 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": container with ID starting with 560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958 not found: ID does not exist" containerID="560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.922765 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958"} err="failed to get container status \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": rpc error: code = NotFound desc = could not find container \"560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958\": container with ID starting with 560f57b3491ce1ee56c3ba3082a4da29491c357f456b4c3fd4fe07ad4ec44958 not found: ID does not exist" Mar 13 12:41:27 crc kubenswrapper[4837]: I0313 12:41:27.940332 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 12:41:28 crc kubenswrapper[4837]: I0313 12:41:28.388096 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 12:41:28 crc kubenswrapper[4837]: I0313 12:41:28.746237 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0244acef-b630-4b97-9bb5-9f99de391613","Type":"ContainerStarted","Data":"bc32204be51f88f37836971c45cfffe3d3563a242517dbcf9f55f53e20d96bf0"} Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.058338 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6859fd59-d276-46f7-85ce-3e4a1d934bc0" path="/var/lib/kubelet/pods/6859fd59-d276-46f7-85ce-3e4a1d934bc0/volumes" Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.760106 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0244acef-b630-4b97-9bb5-9f99de391613","Type":"ContainerStarted","Data":"8276c52e1e7b4f82cdb660276ad4c0dc37d71176853a44bf73d49eadf6bf1474"} Mar 13 12:41:29 crc kubenswrapper[4837]: I0313 12:41:29.782478 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.028978374 podStartE2EDuration="2.782452487s" podCreationTimestamp="2026-03-13 12:41:27 +0000 UTC" firstStartedPulling="2026-03-13 12:41:28.392491689 +0000 UTC m=+3204.030758452" lastFinishedPulling="2026-03-13 12:41:29.145965802 +0000 UTC m=+3204.784232565" observedRunningTime="2026-03-13 12:41:29.779003928 +0000 UTC m=+3205.417270701" watchObservedRunningTime="2026-03-13 12:41:29.782452487 +0000 UTC m=+3205.420719260" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.170990 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.173038 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.179099 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306604 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306653 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.306956 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408579 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408722 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.408743 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.409254 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.409397 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.429240 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"redhat-operators-qxmkr\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:30 crc kubenswrapper[4837]: I0313 12:41:30.493461 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:31 crc kubenswrapper[4837]: W0313 12:41:31.016206 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7daa3751_d057_474f_9a0f_79fdada329a2.slice/crio-d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51 WatchSource:0}: Error finding container d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51: Status 404 returned error can't find the container with id d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51 Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.020974 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.779479 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" exitCode=0 Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.779604 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0"} Mar 13 12:41:31 crc kubenswrapper[4837]: I0313 12:41:31.780138 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51"} Mar 13 12:41:32 crc kubenswrapper[4837]: I0313 12:41:32.790729 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} Mar 13 12:41:34 crc kubenswrapper[4837]: I0313 12:41:34.048409 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:34 crc kubenswrapper[4837]: E0313 12:41:34.048964 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:37 crc kubenswrapper[4837]: I0313 12:41:37.856910 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" exitCode=0 Mar 13 12:41:37 crc kubenswrapper[4837]: I0313 12:41:37.857007 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" 
event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} Mar 13 12:41:38 crc kubenswrapper[4837]: I0313 12:41:38.871854 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerStarted","Data":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} Mar 13 12:41:38 crc kubenswrapper[4837]: I0313 12:41:38.899011 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxmkr" podStartSLOduration=2.337802442 podStartE2EDuration="8.898987945s" podCreationTimestamp="2026-03-13 12:41:30 +0000 UTC" firstStartedPulling="2026-03-13 12:41:31.781313117 +0000 UTC m=+3207.419579880" lastFinishedPulling="2026-03-13 12:41:38.34249862 +0000 UTC m=+3213.980765383" observedRunningTime="2026-03-13 12:41:38.894244426 +0000 UTC m=+3214.532511189" watchObservedRunningTime="2026-03-13 12:41:38.898987945 +0000 UTC m=+3214.537254708" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.314148 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.316713 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.336726 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.483573 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.483889 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.484305 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.585839 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586200 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod 
\"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586326 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586703 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.586819 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.609696 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"redhat-marketplace-xpq8f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:39 crc kubenswrapper[4837]: I0313 12:41:39.646107 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.115716 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:40 crc kubenswrapper[4837]: W0313 12:41:40.122900 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db90d1b_c7eb_4de2_8783_417fd25bdc6f.slice/crio-af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10 WatchSource:0}: Error finding container af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10: Status 404 returned error can't find the container with id af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10 Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.494624 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.494700 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891614 4837 generic.go:334] "Generic (PLEG): container finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3" exitCode=0 Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891677 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"} Mar 13 12:41:40 crc kubenswrapper[4837]: I0313 12:41:40.891982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10"} Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.551381 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:41:41 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:41:41 crc kubenswrapper[4837]: > Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.702995 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.706477 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.714400 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832227 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832296 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.832682 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.905001 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"} Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.934535 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.934941 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935064 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.935498 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:41 crc kubenswrapper[4837]: I0313 12:41:41.957849 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"community-operators-wdpnx\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.042373 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.600965 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.914693 4837 generic.go:334] "Generic (PLEG): container finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5" exitCode=0 Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.914990 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"} Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917357 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1" exitCode=0 Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1"} Mar 13 12:41:42 crc kubenswrapper[4837]: I0313 12:41:42.917458 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"e5e7a4dd77c0df97d0b938b4b0928d9a15ca84c095db93e2aee641e194eec7e0"} Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.931904 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerStarted","Data":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"} Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.937227 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255"} Mar 13 12:41:43 crc kubenswrapper[4837]: I0313 12:41:43.979138 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xpq8f" podStartSLOduration=2.560836048 podStartE2EDuration="4.979120539s" podCreationTimestamp="2026-03-13 12:41:39 +0000 UTC" firstStartedPulling="2026-03-13 12:41:40.893261021 +0000 UTC m=+3216.531527784" lastFinishedPulling="2026-03-13 
12:41:43.311545512 +0000 UTC m=+3218.949812275" observedRunningTime="2026-03-13 12:41:43.956443362 +0000 UTC m=+3219.594710125" watchObservedRunningTime="2026-03-13 12:41:43.979120539 +0000 UTC m=+3219.617387302" Mar 13 12:41:46 crc kubenswrapper[4837]: I0313 12:41:46.979813 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255" exitCode=0 Mar 13 12:41:46 crc kubenswrapper[4837]: I0313 12:41:46.979877 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255"} Mar 13 12:41:47 crc kubenswrapper[4837]: I0313 12:41:47.992774 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerStarted","Data":"491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7"} Mar 13 12:41:48 crc kubenswrapper[4837]: I0313 12:41:48.013131 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wdpnx" podStartSLOduration=2.524175416 podStartE2EDuration="7.013112227s" podCreationTimestamp="2026-03-13 12:41:41 +0000 UTC" firstStartedPulling="2026-03-13 12:41:42.92053695 +0000 UTC m=+3218.558803713" lastFinishedPulling="2026-03-13 12:41:47.409473761 +0000 UTC m=+3223.047740524" observedRunningTime="2026-03-13 12:41:48.007284303 +0000 UTC m=+3223.645551066" watchObservedRunningTime="2026-03-13 12:41:48.013112227 +0000 UTC m=+3223.651379000" Mar 13 12:41:48 crc kubenswrapper[4837]: I0313 12:41:48.048580 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:48 crc kubenswrapper[4837]: E0313 12:41:48.048881 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.647722 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.648088 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:49 crc kubenswrapper[4837]: I0313 12:41:49.693906 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:50 crc kubenswrapper[4837]: I0313 12:41:50.061510 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:51 crc kubenswrapper[4837]: I0313 12:41:51.296254 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:51 crc kubenswrapper[4837]: I0313 12:41:51.557608 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" 
podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:41:51 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:41:51 crc kubenswrapper[4837]: > Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.033397 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xpq8f" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server" containerID="cri-o://13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" gracePeriod=2 Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.043225 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.043296 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.105240 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.491495 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.493079 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497004 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jkb99"/"default-dockercfg-6g2c8" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497004 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jkb99"/"openshift-service-ca.crt" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.497200 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jkb99"/"kube-root-ca.crt" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.507695 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.523136 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665660 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.665813 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") pod \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\" (UID: \"7db90d1b-c7eb-4de2-8783-417fd25bdc6f\") " Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666074 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666187 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.666854 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities" (OuterVolumeSpecName: "utilities") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.681457 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp" (OuterVolumeSpecName: "kube-api-access-q5gjp") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "kube-api-access-q5gjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.692265 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7db90d1b-c7eb-4de2-8783-417fd25bdc6f" (UID: "7db90d1b-c7eb-4de2-8783-417fd25bdc6f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768303 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768446 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768497 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768510 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5gjp\" (UniqueName: \"kubernetes.io/projected/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-kube-api-access-q5gjp\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768519 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7db90d1b-c7eb-4de2-8783-417fd25bdc6f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.768985 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.799469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"must-gather-4lckb\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:52 crc kubenswrapper[4837]: I0313 12:41:52.821826 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.048499 4837 generic.go:334] "Generic (PLEG): container finished" podID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" exitCode=0 Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.048717 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xpq8f" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068408 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"} Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068767 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xpq8f" event={"ID":"7db90d1b-c7eb-4de2-8783-417fd25bdc6f","Type":"ContainerDied","Data":"af64dc414a6b7437b27100700bc03b998a48953755354439146e81c31e764a10"} Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.068796 4837 scope.go:117] "RemoveContainer" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.095058 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.104536 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xpq8f"] Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.106591 4837 scope.go:117] "RemoveContainer" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.109755 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.132600 4837 scope.go:117] "RemoveContainer" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.149982 4837 scope.go:117] "RemoveContainer" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151153 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": container with ID starting with 13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff not found: ID does not exist" containerID="13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151180 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff"} err="failed to get container status \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": rpc error: code = NotFound desc = could not find container \"13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff\": container with ID starting with 13a40aecebc8beb6c6f0fd5033d8b1148b58792521473e01cfa82acbe62fe7ff not found: ID does not exist" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151198 4837 scope.go:117] "RemoveContainer" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5" Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151442 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": container with ID starting with 
0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5 not found: ID does not exist" containerID="0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151458 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5"} err="failed to get container status \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": rpc error: code = NotFound desc = could not find container \"0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5\": container with ID starting with 0275010ad2cdf8863266a601f875f06c0f5d019bb4aa2ea17b90064fa41c4bf5 not found: ID does not exist" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151470 4837 scope.go:117] "RemoveContainer" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3" Mar 13 12:41:53 crc kubenswrapper[4837]: E0313 12:41:53.151759 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": container with ID starting with 673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3 not found: ID does not exist" containerID="673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.151776 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3"} err="failed to get container status \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": rpc error: code = NotFound desc = could not find container \"673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3\": container with ID starting with 673af5193713526abe23d3ad06717349574812d657dea12723d9f30f91f6c7a3 not found: ID does not exist" Mar 13 12:41:53 crc kubenswrapper[4837]: I0313 12:41:53.268734 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:41:54 crc kubenswrapper[4837]: I0313 12:41:54.065685 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"fad50242e977d7ae08cb6453193c2359ef62abea67978e1d59225423ce6fb7f7"} Mar 13 12:41:55 crc kubenswrapper[4837]: I0313 12:41:55.066916 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" path="/var/lib/kubelet/pods/7db90d1b-c7eb-4de2-8783-417fd25bdc6f/volumes" Mar 13 12:41:56 crc kubenswrapper[4837]: I0313 12:41:56.697446 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:41:56 crc kubenswrapper[4837]: I0313 12:41:56.697696 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wdpnx" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server" containerID="cri-o://491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7" gracePeriod=2 Mar 13 12:41:57 crc kubenswrapper[4837]: I0313 12:41:57.112352 4837 generic.go:334] "Generic (PLEG): container finished" podID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerID="491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7" exitCode=0 Mar 13 
12:41:57 crc kubenswrapper[4837]: I0313 12:41:57.112415 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7"} Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.069135 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:41:59 crc kubenswrapper[4837]: E0313 12:41:59.070086 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.740663 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855507 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855792 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.855814 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") pod \"75dcf43e-e9a3-4956-9582-9663efc7a07b\" (UID: \"75dcf43e-e9a3-4956-9582-9663efc7a07b\") " Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.857686 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities" (OuterVolumeSpecName: "utilities") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.861408 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5" (OuterVolumeSpecName: "kube-api-access-8cmm5") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "kube-api-access-8cmm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.906175 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75dcf43e-e9a3-4956-9582-9663efc7a07b" (UID: "75dcf43e-e9a3-4956-9582-9663efc7a07b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958057 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmm5\" (UniqueName: \"kubernetes.io/projected/75dcf43e-e9a3-4956-9582-9663efc7a07b-kube-api-access-8cmm5\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958102 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:41:59 crc kubenswrapper[4837]: I0313 12:41:59.958117 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dcf43e-e9a3-4956-9582-9663efc7a07b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.157762 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdpnx" event={"ID":"75dcf43e-e9a3-4956-9582-9663efc7a07b","Type":"ContainerDied","Data":"e5e7a4dd77c0df97d0b938b4b0928d9a15ca84c095db93e2aee641e194eec7e0"} Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.158088 4837 scope.go:117] "RemoveContainer" containerID="491e172a4fdd94d834f91503b7344d1840993d62c62190274ec8d7067e9948b7" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.157811 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdpnx" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164027 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164620 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164663 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server" Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164676 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-content" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164682 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-content" Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164693 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-utilities" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164700 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-utilities" Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164712 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-utilities" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164742 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="extract-utilities" Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164762 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server" Mar 13 12:42:00 
crc kubenswrapper[4837]: I0313 12:42:00.164768 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server" Mar 13 12:42:00 crc kubenswrapper[4837]: E0313 12:42:00.164780 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-content" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.164785 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="extract-content" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165019 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" containerName="registry-server" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165035 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db90d1b-c7eb-4de2-8783-417fd25bdc6f" containerName="registry-server" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.165963 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.175722 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.198631 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.198760 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.199550 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.203231 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed"} Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.233436 4837 scope.go:117] "RemoveContainer" containerID="6f8327024dd21bd081ba023805b2e005959158663382bf7c861522d50eaf9255" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.238992 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.262724 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wdpnx"] Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.263738 4837 scope.go:117] "RemoveContainer" containerID="1cdd21f387fca0b63738dbf1a676060e6b1a2034a411336f47b42f6f70b348d1" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.264836 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.367190 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpj48\" (UniqueName: 
\"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.383454 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"auto-csr-approver-29556762-g52qb\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.522080 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:00 crc kubenswrapper[4837]: I0313 12:42:00.972745 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.058510 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dcf43e-e9a3-4956-9582-9663efc7a07b" path="/var/lib/kubelet/pods/75dcf43e-e9a3-4956-9582-9663efc7a07b/volumes" Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.212661 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerStarted","Data":"f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3"} Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.214414 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerStarted","Data":"d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b"} Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.234808 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jkb99/must-gather-4lckb" podStartSLOduration=2.788961667 podStartE2EDuration="9.234791402s" podCreationTimestamp="2026-03-13 12:41:52 +0000 UTC" firstStartedPulling="2026-03-13 12:41:53.277113075 +0000 UTC m=+3228.915379848" lastFinishedPulling="2026-03-13 12:41:59.72294282 +0000 UTC m=+3235.361209583" observedRunningTime="2026-03-13 12:42:01.228823223 +0000 UTC m=+3236.867089986" watchObservedRunningTime="2026-03-13 12:42:01.234791402 +0000 UTC m=+3236.873058165" Mar 13 12:42:01 crc kubenswrapper[4837]: I0313 12:42:01.556355 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:42:01 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:42:01 crc kubenswrapper[4837]: > Mar 13 12:42:02 crc kubenswrapper[4837]: I0313 12:42:02.243064 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerStarted","Data":"d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1"} Mar 13 12:42:02 crc kubenswrapper[4837]: I0313 12:42:02.262386 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556762-g52qb" podStartSLOduration=1.457253045 podStartE2EDuration="2.262366872s" 
podCreationTimestamp="2026-03-13 12:42:00 +0000 UTC" firstStartedPulling="2026-03-13 12:42:00.983808746 +0000 UTC m=+3236.622075549" lastFinishedPulling="2026-03-13 12:42:01.788922603 +0000 UTC m=+3237.427189376" observedRunningTime="2026-03-13 12:42:02.256726184 +0000 UTC m=+3237.894992947" watchObservedRunningTime="2026-03-13 12:42:02.262366872 +0000 UTC m=+3237.900633635" Mar 13 12:42:02 crc kubenswrapper[4837]: E0313 12:42:02.710706 4837 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.138:40232->38.102.83.138:43005: read tcp 38.102.83.138:40232->38.102.83.138:43005: read: connection reset by peer Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.255982 4837 generic.go:334] "Generic (PLEG): container finished" podID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerID="d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1" exitCode=0 Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.256044 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerDied","Data":"d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1"} Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.838332 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.840171 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.937271 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:03 crc kubenswrapper[4837]: I0313 12:42:03.937530 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039695 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039755 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.039936 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.058420 4837 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"crc-debug-wvz2q\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.158721 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:04 crc kubenswrapper[4837]: W0313 12:42:04.192230 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50b16782_2e4c_48fd_ba3e_f0557fdbaae8.slice/crio-e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110 WatchSource:0}: Error finding container e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110: Status 404 returned error can't find the container with id e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110 Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.266273 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerStarted","Data":"e787c8c6bb3dbe5c16cf7bc30198ada072e5d26721721acf3e78813724296110"} Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.655206 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.751805 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") pod \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\" (UID: \"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4\") " Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.758582 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48" (OuterVolumeSpecName: "kube-api-access-tpj48") pod "e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" (UID: "e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4"). InnerVolumeSpecName "kube-api-access-tpj48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:04 crc kubenswrapper[4837]: I0313 12:42:04.854630 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpj48\" (UniqueName: \"kubernetes.io/projected/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4-kube-api-access-tpj48\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278167 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556762-g52qb" event={"ID":"e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4","Type":"ContainerDied","Data":"f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3"} Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278205 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39f1cc1116054070489cd8e691ba3a60eebf43d9d1a29eb7b4eab84d3b1b1f3" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.278221 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556762-g52qb" Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.328527 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:42:05 crc kubenswrapper[4837]: I0313 12:42:05.341751 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556756-xz7n5"] Mar 13 12:42:07 crc kubenswrapper[4837]: I0313 12:42:07.066247 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8833ed1c-80bb-4529-9f4a-6109d1a39f13" path="/var/lib/kubelet/pods/8833ed1c-80bb-4529-9f4a-6109d1a39f13/volumes" Mar 13 12:42:11 crc kubenswrapper[4837]: I0313 12:42:11.543810 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" probeResult="failure" output=< Mar 13 12:42:11 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:42:11 crc kubenswrapper[4837]: > Mar 13 12:42:12 crc kubenswrapper[4837]: I0313 12:42:12.049706 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:12 crc kubenswrapper[4837]: E0313 12:42:12.050017 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:42:16 crc kubenswrapper[4837]: I0313 12:42:16.386390 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerStarted","Data":"5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98"} Mar 13 12:42:16 crc kubenswrapper[4837]: I0313 12:42:16.403946 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" podStartSLOduration=1.8321656339999999 podStartE2EDuration="13.403925032s" podCreationTimestamp="2026-03-13 12:42:03 +0000 UTC" firstStartedPulling="2026-03-13 12:42:04.195937158 +0000 UTC m=+3239.834203921" lastFinishedPulling="2026-03-13 12:42:15.767696556 +0000 UTC m=+3251.405963319" observedRunningTime="2026-03-13 12:42:16.396659942 +0000 UTC m=+3252.034926715" watchObservedRunningTime="2026-03-13 12:42:16.403925032 +0000 UTC m=+3252.042191795" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.543161 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.591869 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:20 crc kubenswrapper[4837]: I0313 12:42:20.783136 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:22 crc kubenswrapper[4837]: I0313 12:42:22.430799 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxmkr" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" 
containerID="cri-o://d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" gracePeriod=2 Mar 13 12:42:22 crc kubenswrapper[4837]: I0313 12:42:22.914981 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019412 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019889 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.019954 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") pod \"7daa3751-d057-474f-9a0f-79fdada329a2\" (UID: \"7daa3751-d057-474f-9a0f-79fdada329a2\") " Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.020509 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities" (OuterVolumeSpecName: "utilities") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.038370 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2" (OuterVolumeSpecName: "kube-api-access-mz5c2") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "kube-api-access-mz5c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.122764 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.122992 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz5c2\" (UniqueName: \"kubernetes.io/projected/7daa3751-d057-474f-9a0f-79fdada329a2-kube-api-access-mz5c2\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.140542 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7daa3751-d057-474f-9a0f-79fdada329a2" (UID: "7daa3751-d057-474f-9a0f-79fdada329a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.224384 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7daa3751-d057-474f-9a0f-79fdada329a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440415 4837 generic.go:334] "Generic (PLEG): container finished" podID="7daa3751-d057-474f-9a0f-79fdada329a2" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" exitCode=0 Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440459 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440509 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxmkr" event={"ID":"7daa3751-d057-474f-9a0f-79fdada329a2","Type":"ContainerDied","Data":"d198cd5e4de064380ab84dfd25c9bf6271c0c2b325c9d7fcdbcadff172704d51"} Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440515 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxmkr" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.440527 4837 scope.go:117] "RemoveContainer" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.459989 4837 scope.go:117] "RemoveContainer" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.485700 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.488776 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxmkr"] Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.489041 4837 scope.go:117] "RemoveContainer" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.534832 4837 scope.go:117] "RemoveContainer" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 12:42:23.535182 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": container with ID starting with d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f not found: ID does not exist" containerID="d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535212 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f"} err="failed to get container status \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": rpc error: code = NotFound desc = could not find container \"d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f\": container with ID starting with d7c4eda8f971bf0f17ec252261cf6e6a9dc759fc4a7ea3f6c6a13900e540a85f not found: ID does not exist" Mar 13 12:42:23 crc 
kubenswrapper[4837]: I0313 12:42:23.535231 4837 scope.go:117] "RemoveContainer" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 12:42:23.535487 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": container with ID starting with 75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc not found: ID does not exist" containerID="75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535514 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc"} err="failed to get container status \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": rpc error: code = NotFound desc = could not find container \"75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc\": container with ID starting with 75c1c0316d400d5f2fc848d84d2ba2bd6d6be3266e3bc1c32dfc91a9178306cc not found: ID does not exist" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535527 4837 scope.go:117] "RemoveContainer" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: E0313 12:42:23.535766 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": container with ID starting with 1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0 not found: ID does not exist" containerID="1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0" Mar 13 12:42:23 crc kubenswrapper[4837]: I0313 12:42:23.535796 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0"} err="failed to get container status \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": rpc error: code = NotFound desc = could not find container \"1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0\": container with ID starting with 1b2c7f9c7376794e5560f9e528d8a765c014e6561edefe605eb3c2dd6333a2a0 not found: ID does not exist" Mar 13 12:42:25 crc kubenswrapper[4837]: I0313 12:42:25.062451 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" path="/var/lib/kubelet/pods/7daa3751-d057-474f-9a0f-79fdada329a2/volumes" Mar 13 12:42:27 crc kubenswrapper[4837]: I0313 12:42:27.048346 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:27 crc kubenswrapper[4837]: E0313 12:42:27.049305 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:42:31 crc kubenswrapper[4837]: I0313 12:42:31.392334 4837 scope.go:117] "RemoveContainer" containerID="8991bbc909e2098b2d6fb047c31dca6c613e8c861798107378538f426d77e480" 
Mar 13 12:42:38 crc kubenswrapper[4837]: I0313 12:42:38.048951 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:42:38 crc kubenswrapper[4837]: I0313 12:42:38.575989 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} Mar 13 12:42:57 crc kubenswrapper[4837]: I0313 12:42:57.767249 4837 generic.go:334] "Generic (PLEG): container finished" podID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerID="5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98" exitCode=0 Mar 13 12:42:57 crc kubenswrapper[4837]: I0313 12:42:57.767314 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" event={"ID":"50b16782-2e4c-48fd-ba3e-f0557fdbaae8","Type":"ContainerDied","Data":"5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98"} Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.868631 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.897392 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:58 crc kubenswrapper[4837]: I0313 12:42:58.904939 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-wvz2q"] Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.048068 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") pod \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.049069 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") pod \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\" (UID: \"50b16782-2e4c-48fd-ba3e-f0557fdbaae8\") " Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.049228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host" (OuterVolumeSpecName: "host") pod "50b16782-2e4c-48fd-ba3e-f0557fdbaae8" (UID: "50b16782-2e4c-48fd-ba3e-f0557fdbaae8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.050305 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.059228 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2" (OuterVolumeSpecName: "kube-api-access-nsfj2") pod "50b16782-2e4c-48fd-ba3e-f0557fdbaae8" (UID: "50b16782-2e4c-48fd-ba3e-f0557fdbaae8"). InnerVolumeSpecName "kube-api-access-nsfj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.069722 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" path="/var/lib/kubelet/pods/50b16782-2e4c-48fd-ba3e-f0557fdbaae8/volumes" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.152492 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsfj2\" (UniqueName: \"kubernetes.io/projected/50b16782-2e4c-48fd-ba3e-f0557fdbaae8-kube-api-access-nsfj2\") on node \"crc\" DevicePath \"\"" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.792940 4837 scope.go:117] "RemoveContainer" containerID="5a28fcc3eaaccd2193eb19c5505852014f77b072025f81e4dfef900c7784ce98" Mar 13 12:42:59 crc kubenswrapper[4837]: I0313 12:42:59.792976 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-wvz2q" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.129081 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"] Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130000 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-content" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130022 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-content" Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130059 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-utilities" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130069 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="extract-utilities" Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130091 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130106 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00" Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130128 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130138 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc" Mar 13 12:43:00 crc kubenswrapper[4837]: E0313 12:43:00.130165 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130174 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130467 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daa3751-d057-474f-9a0f-79fdada329a2" containerName="registry-server" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130493 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b16782-2e4c-48fd-ba3e-f0557fdbaae8" containerName="container-00" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.130534 
4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" containerName="oc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.131443 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.173014 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.173137 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274619 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274744 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.274771 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.296655 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"crc-debug-qdfmc\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.452229 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:00 crc kubenswrapper[4837]: W0313 12:43:00.485369 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b157bd4_1b09_44fc_ba60_6b9f3e008253.slice/crio-bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc WatchSource:0}: Error finding container bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc: Status 404 returned error can't find the container with id bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.801766 4837 generic.go:334] "Generic (PLEG): container finished" podID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerID="021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42" exitCode=0 Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.801927 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" event={"ID":"8b157bd4-1b09-44fc-ba60-6b9f3e008253","Type":"ContainerDied","Data":"021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42"} Mar 13 12:43:00 crc kubenswrapper[4837]: I0313 12:43:00.802110 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" event={"ID":"8b157bd4-1b09-44fc-ba60-6b9f3e008253","Type":"ContainerStarted","Data":"bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc"} Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.278090 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"] Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.288846 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-qdfmc"] Mar 13 12:43:01 crc kubenswrapper[4837]: I0313 12:43:01.925920 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011027 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") pod \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011119 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") pod \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\" (UID: \"8b157bd4-1b09-44fc-ba60-6b9f3e008253\") " Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011679 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host" (OuterVolumeSpecName: "host") pod "8b157bd4-1b09-44fc-ba60-6b9f3e008253" (UID: "8b157bd4-1b09-44fc-ba60-6b9f3e008253"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.011809 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b157bd4-1b09-44fc-ba60-6b9f3e008253-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.017327 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52" (OuterVolumeSpecName: "kube-api-access-58h52") pod "8b157bd4-1b09-44fc-ba60-6b9f3e008253" (UID: "8b157bd4-1b09-44fc-ba60-6b9f3e008253"). InnerVolumeSpecName "kube-api-access-58h52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.113803 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58h52\" (UniqueName: \"kubernetes.io/projected/8b157bd4-1b09-44fc-ba60-6b9f3e008253-kube-api-access-58h52\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.447851 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"] Mar 13 12:43:02 crc kubenswrapper[4837]: E0313 12:43:02.448218 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.448235 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.448463 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" containerName="container-00" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.449128 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.521195 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.521354 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.622794 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.622912 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.623101 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.641681 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"crc-debug-jtfh6\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.763948 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:02 crc kubenswrapper[4837]: W0313 12:43:02.793242 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3157eabd_f1a5_4ab2_b3ac_aae960131503.slice/crio-8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c WatchSource:0}: Error finding container 8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c: Status 404 returned error can't find the container with id 8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.821463 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-qdfmc" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.822931 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb188d45448e9137888500bafe79ed93c9d6da36a90e2a65f056bd2b9406ffbc" Mar 13 12:43:02 crc kubenswrapper[4837]: I0313 12:43:02.829984 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" event={"ID":"3157eabd-f1a5-4ab2-b3ac-aae960131503","Type":"ContainerStarted","Data":"8770535a8d77dab7ba0fe6f49814a1c54fbc6bedeb1cfde58bf1f489718da20c"} Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.069578 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b157bd4-1b09-44fc-ba60-6b9f3e008253" path="/var/lib/kubelet/pods/8b157bd4-1b09-44fc-ba60-6b9f3e008253/volumes" Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.842331 4837 generic.go:334] "Generic (PLEG): container finished" podID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerID="344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a" exitCode=0 Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.842391 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" event={"ID":"3157eabd-f1a5-4ab2-b3ac-aae960131503","Type":"ContainerDied","Data":"344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a"} Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.916997 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"] Mar 13 12:43:03 crc kubenswrapper[4837]: I0313 12:43:03.928117 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/crc-debug-jtfh6"] Mar 13 12:43:04 crc kubenswrapper[4837]: I0313 12:43:04.970191 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.167990 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") pod \"3157eabd-f1a5-4ab2-b3ac-aae960131503\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168081 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") pod \"3157eabd-f1a5-4ab2-b3ac-aae960131503\" (UID: \"3157eabd-f1a5-4ab2-b3ac-aae960131503\") " Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168538 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host" (OuterVolumeSpecName: "host") pod "3157eabd-f1a5-4ab2-b3ac-aae960131503" (UID: "3157eabd-f1a5-4ab2-b3ac-aae960131503"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.168866 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3157eabd-f1a5-4ab2-b3ac-aae960131503-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.176966 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt" (OuterVolumeSpecName: "kube-api-access-59rxt") pod "3157eabd-f1a5-4ab2-b3ac-aae960131503" (UID: "3157eabd-f1a5-4ab2-b3ac-aae960131503"). InnerVolumeSpecName "kube-api-access-59rxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.274219 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rxt\" (UniqueName: \"kubernetes.io/projected/3157eabd-f1a5-4ab2-b3ac-aae960131503-kube-api-access-59rxt\") on node \"crc\" DevicePath \"\"" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.860550 4837 scope.go:117] "RemoveContainer" containerID="344719b3277f9755326094abd259b245489fd00736db03b65759e9e2ad87423a" Mar 13 12:43:05 crc kubenswrapper[4837]: I0313 12:43:05.860563 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jkb99/crc-debug-jtfh6" Mar 13 12:43:07 crc kubenswrapper[4837]: I0313 12:43:07.067304 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" path="/var/lib/kubelet/pods/3157eabd-f1a5-4ab2-b3ac-aae960131503/volumes" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.130367 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.315168 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.329314 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api-log/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.390141 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener-log/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.516218 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.528934 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker-log/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.728175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj_2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.773048 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-central-agent/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.835328 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-notification-agent/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.911966 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/proxy-httpd/0.log" Mar 13 12:43:19 crc kubenswrapper[4837]: I0313 12:43:19.948303 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/sg-core/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.062154 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.110544 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api-log/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.202835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/cinder-scheduler/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.297673 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/probe/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.405851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s95mk_875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.499709 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp_0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.593474 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.742923 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.801313 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/dnsmasq-dns/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.839553 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts_121f6d1b-1277-4d68-8a48-6c4630dd6fe5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:20 crc kubenswrapper[4837]: I0313 12:43:20.989548 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-httpd/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.008175 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-log/0.log" Mar 13 
12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.171288 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-log/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.191399 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-httpd/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.317859 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.460595 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gj59c_6cc8d0dd-d1e6-4374-bb90-aaefc9197350/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.625761 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon-log/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.743556 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2b48q_033a02c2-cbe4-4676-ae46-f9b9b17a60fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:21 crc kubenswrapper[4837]: I0313 12:43:21.956003 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_abd69ff2-e72e-40c0-925f-d0c1c0a40f9a/kube-state-metrics/0.log" Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.061100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55dc4d44f8-mvjvg_9cb9614d-a433-4be3-8145-4c1c8593404f/keystone-api/0.log" Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.241680 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5_394104d4-0291-4071-a7da-d7b71e0f4083/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.515994 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-api/0.log" Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.584447 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-httpd/0.log" Mar 13 12:43:22 crc kubenswrapper[4837]: I0313 12:43:22.722601 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4_20f35066-9c10-4433-a655-f5cef18d4deb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.252208 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-log/0.log" Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.418935 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-api/0.log" Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.516453 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_58240a84-c8ab-43a9-8113-eaf2d0ddea2e/nova-cell0-conductor-conductor/0.log" Mar 13 12:43:23 crc 
kubenswrapper[4837]: I0313 12:43:23.595570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9a51debb-c1cb-4a55-b845-e89d89d11e86/nova-cell1-conductor-conductor/0.log" Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.846693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_662e258d-fe94-4373-912d-c906f1e93c90/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 12:43:23 crc kubenswrapper[4837]: I0313 12:43:23.863815 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4jdmk_e6986f16-e143-49f4-81e5-58abba717876/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.129593 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-log/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.266849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d380e047-7297-4835-b948-6c86c6b6aa27/nova-scheduler-scheduler/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.338368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.541016 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.580618 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/galera/0.log" Mar 13 12:43:24 crc kubenswrapper[4837]: I0313 12:43:24.706349 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.013667 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.015392 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/galera/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.097413 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-metadata/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.183976 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d15c820-a2ee-4d4c-986f-2c2f09b43f79/openstackclient/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.237876 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w69p6_18eb496a-7d9f-4bf6-af71-3b7b585d0f7d/openstack-network-exporter/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.428827 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nbhpw_32dc51d9-5638-4530-91c8-5be8c13e60f3/ovn-controller/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.519468 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.666212 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.733037 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.754331 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovs-vswitchd/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.945240 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kbffp_092bd277-504a-450d-aca1-d8ecc18f0c9f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:25 crc kubenswrapper[4837]: I0313 12:43:25.956452 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/openstack-network-exporter/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.058249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/ovn-northd/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.160941 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/openstack-network-exporter/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.214464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/ovsdbserver-nb/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.426271 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/openstack-network-exporter/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.471390 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/ovsdbserver-sb/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.612710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-api/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.708986 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.734802 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-log/0.log" Mar 13 12:43:26 crc kubenswrapper[4837]: I0313 12:43:26.957153 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.009698 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/rabbitmq/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.025473 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.197033 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.229068 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/rabbitmq/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.349869 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d_3b96ea7e-2148-4659-9a26-3335c88888c1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.457889 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pz9nt_0b7402b1-0b76-4ffa-b37f-6e014183f6a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.571893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6_bfedd3e5-e8d7-4311-9a0d-30276ce40418/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.698573 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s6jdp_f12ac62a-2011-4e89-a16f-e136959f9d1a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.794756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vjnpx_4ddcb794-ab03-4308-a93c-c5929ed96e01/ssh-known-hosts-edpm-deployment/0.log" Mar 13 12:43:27 crc kubenswrapper[4837]: I0313 12:43:27.971771 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-server/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.009909 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-httpd/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.170404 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-69xgx_24998567-afa6-4adc-a503-4fc054946aef/swift-ring-rebalance/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.231395 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-auditor/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.259331 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-reaper/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.362424 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-replicator/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.440783 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-server/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.450295 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-auditor/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.501908 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-replicator/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.603290 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-server/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.628649 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-updater/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.651978 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-auditor/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.730213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-expirer/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.849385 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-updater/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.862564 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-replicator/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.863297 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-server/0.log" Mar 13 12:43:28 crc kubenswrapper[4837]: I0313 12:43:28.997479 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/rsync/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.075263 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/swift-recon-cron/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.230881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x_ac15848f-4f6f-4159-828f-d30a77f93a4b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.305059 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66bdda91-c5b6-4879-9adf-21846884c797/tempest-tests-tempest-tests-runner/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.490215 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0244acef-b630-4b97-9bb5-9f99de391613/test-operator-logs-container/0.log" Mar 13 12:43:29 crc kubenswrapper[4837]: I0313 12:43:29.540304 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42br8_e3ec33da-9091-4eb1-aafa-62b9bdf16072/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:43:39 crc kubenswrapper[4837]: I0313 12:43:39.525118 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ae39431b-5fa4-4a09-b76f-44b4d256c129/memcached/0.log" Mar 13 12:43:53 crc kubenswrapper[4837]: 
I0313 12:43:53.608359 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-b7cdx_e645f00a-8463-4fac-b010-f0500b54d68a/manager/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.060161 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.279575 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.317497 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.480526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.645233 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.659101 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.766431 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-kbn8z_0a24601d-8e41-4f99-9e33-870d791a3e7e/manager/0.log" Mar 13 12:43:54 crc kubenswrapper[4837]: I0313 12:43:54.904043 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/extract/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.183534 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mrgb9_1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e/manager/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.639157 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-ss4rm_b2c881d7-03db-4608-a3f4-9a9ad8b2f5da/manager/0.log" Mar 13 12:43:55 crc kubenswrapper[4837]: I0313 12:43:55.670803 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-bvmr7_11a29883-0638-4da4-a1dc-bf2127a3645c/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.036292 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-9zvxf_89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.223142 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-fhlk9_c19c3466-ab50-4be3-8299-d7b8b3d263df/manager/0.log" Mar 13 12:43:56 
crc kubenswrapper[4837]: I0313 12:43:56.428863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-kc2x6_9bd066a9-3999-405a-b619-540678a46ded/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.474556 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-twrg7_fa1b1ba2-3856-49cb-bda4-8ac5e63b5298/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.743231 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7nm95_046bdee0-f0cf-4d17-916b-68d301502473/manager/0.log" Mar 13 12:43:56 crc kubenswrapper[4837]: I0313 12:43:56.887020 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-6ht9l_3059d7c0-2624-4d3e-af0f-de054401f1ec/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.103151 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-shrx7_ee1c592d-7979-4b75-b8e4-7ccd6d7d6048/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.187063 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-7f7zd_561aed86-f289-4dd1-8c53-307ccdc99165/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.269738 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b77x9vc_7b38159c-e030-4734-963d-dfc38d29c75c/manager/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.651498 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-c99df78b8-qxmfb_4f8c5e9e-7680-4bc3-8096-0c62a1de4da5/operator/0.log" Mar 13 12:43:57 crc kubenswrapper[4837]: I0313 12:43:57.841362 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mdjzs_9da10ec5-aa1b-4797-91ce-04a91266831a/registry-server/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.174851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-nxwr9_5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d/manager/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.398881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-fwblp_35a21ab1-95b5-446a-ae10-d004e5aa2995/manager/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.605880 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkk4z_ce0c89e1-3fc0-473d-875f-461c8b423061/operator/0.log" Mar 13 12:43:58 crc kubenswrapper[4837]: I0313 12:43:58.831659 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-cfv8z_55649f1c-678e-4e03-be55-7c4435446199/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.136605 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_cb20db22-bd0e-4897-8ed6-a6a80a91ffff/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 
12:43:59.206172 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-dk4nr_fe107e39-b5ec-473d-8851-b57775dadafc/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.247903 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55876d85bb-96mp7_eaf3fa29-f441-43df-9fbe-409d9d8ad871/manager/0.log" Mar 13 12:43:59 crc kubenswrapper[4837]: I0313 12:43:59.334302 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hrcp9_5ef20b1d-5c03-4993-b635-b031ddcab3bf/manager/0.log" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.032335 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-jvdqq_1d59bb7f-598d-4c70-9b8c-ce4e3048691f/manager/0.log" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.146251 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: E0313 12:44:00.150092 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150112 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150295 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3157eabd-f1a5-4ab2-b3ac-aae960131503" containerName="container-00" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.150960 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153607 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153741 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.153760 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.162324 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.300365 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.402903 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.423899 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"auto-csr-approver-29556764-r7n49\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.476596 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:00 crc kubenswrapper[4837]: I0313 12:44:00.950785 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:44:00 crc kubenswrapper[4837]: W0313 12:44:00.952441 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e88464a_8619_4750_ac96_b1ad569fcece.slice/crio-bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e WatchSource:0}: Error finding container bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e: Status 404 returned error can't find the container with id bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e Mar 13 12:44:01 crc kubenswrapper[4837]: I0313 12:44:01.495953 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerStarted","Data":"bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e"} Mar 13 12:44:02 crc kubenswrapper[4837]: I0313 12:44:02.504959 4837 generic.go:334] "Generic (PLEG): container finished" podID="7e88464a-8619-4750-ac96-b1ad569fcece" containerID="1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4" exitCode=0 Mar 13 12:44:02 crc kubenswrapper[4837]: I0313 12:44:02.505078 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerDied","Data":"1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4"} Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.918843 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.971335 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") pod \"7e88464a-8619-4750-ac96-b1ad569fcece\" (UID: \"7e88464a-8619-4750-ac96-b1ad569fcece\") " Mar 13 12:44:03 crc kubenswrapper[4837]: I0313 12:44:03.977868 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7" (OuterVolumeSpecName: "kube-api-access-d7db7") pod "7e88464a-8619-4750-ac96-b1ad569fcece" (UID: "7e88464a-8619-4750-ac96-b1ad569fcece"). InnerVolumeSpecName "kube-api-access-d7db7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.074192 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7db7\" (UniqueName: \"kubernetes.io/projected/7e88464a-8619-4750-ac96-b1ad569fcece-kube-api-access-d7db7\") on node \"crc\" DevicePath \"\"" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.522940 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556764-r7n49" event={"ID":"7e88464a-8619-4750-ac96-b1ad569fcece","Type":"ContainerDied","Data":"bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e"} Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.522982 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb4f6797f10f32b7ae2c1839830c41e8293f02b1b5b7739f3c42113f69cd41e" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.523010 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556764-r7n49" Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.982727 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:44:04 crc kubenswrapper[4837]: I0313 12:44:04.990264 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556758-srcvt"] Mar 13 12:44:05 crc kubenswrapper[4837]: I0313 12:44:05.064274 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38" path="/var/lib/kubelet/pods/0ea011c6-ad1b-46cf-b5c4-11ac11fd6a38/volumes" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.715364 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jrm5t_00848ba6-522a-45c7-81bd-7ab287d77626/control-plane-machine-set-operator/0.log" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.835061 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/kube-rbac-proxy/0.log" Mar 13 12:44:17 crc kubenswrapper[4837]: I0313 12:44:17.909835 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/machine-api-operator/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 12:44:29.433657 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dlspp_5ecc1237-3421-41d5-8efb-a62399ae1d73/cert-manager-controller/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 12:44:29.636987 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xzv5h_67507b8e-35d5-4dff-9239-45b5ef997e53/cert-manager-cainjector/0.log" Mar 13 12:44:29 crc kubenswrapper[4837]: I0313 12:44:29.659199 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ht9vn_0e500b82-1f14-4a1e-937d-00248f195033/cert-manager-webhook/0.log" Mar 13 12:44:32 crc kubenswrapper[4837]: I0313 12:44:32.605000 4837 scope.go:117] "RemoveContainer" containerID="62169da6c39018c4d64900197bc422e10f99368271388e87ca1a65e2ba0fb126" Mar 13 12:44:42 crc kubenswrapper[4837]: I0313 12:44:42.639269 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-fpxmr_00b31b3f-b520-493a-ad26-679e09376e81/nmstate-console-plugin/0.log" Mar 13 12:44:42 crc kubenswrapper[4837]: I0313 12:44:42.969696 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vqqqz_ebe31727-805d-472e-89d3-e99b11435be1/nmstate-handler/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.074711 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/nmstate-metrics/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.087400 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/kube-rbac-proxy/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.234113 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zf78q_ef7096b9-861a-4889-9318-535c35151777/nmstate-operator/0.log" Mar 13 12:44:43 crc kubenswrapper[4837]: I0313 12:44:43.317730 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-6cx5h_0b06c77a-f41d-41a6-b115-f12cc5109c0c/nmstate-webhook/0.log" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.143893 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 13 12:45:00 crc kubenswrapper[4837]: E0313 12:45:00.144757 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.144770 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.144957 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" containerName="oc" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.145618 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.147435 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.147560 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.165966 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256540 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256582 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.256599 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359024 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359068 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359087 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.359860 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod 
\"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.366157 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.385238 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"collect-profiles-29556765-tjz7g\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.505075 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:00 crc kubenswrapper[4837]: I0313 12:45:00.997861 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g"] Mar 13 12:45:01 crc kubenswrapper[4837]: I0313 12:45:01.020951 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerStarted","Data":"9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599"} Mar 13 12:45:02 crc kubenswrapper[4837]: I0313 12:45:02.035597 4837 generic.go:334] "Generic (PLEG): container finished" podID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerID="757048523c491e6eb74ea7ed6665cc66421c95e45bb0822633b5750e9b571caa" exitCode=0 Mar 13 12:45:02 crc kubenswrapper[4837]: I0313 12:45:02.035681 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerDied","Data":"757048523c491e6eb74ea7ed6665cc66421c95e45bb0822633b5750e9b571caa"} Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.430983 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.515645 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.516062 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.516099 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") pod \"5e6600de-c059-4b00-bb32-d08fa205af0b\" (UID: \"5e6600de-c059-4b00-bb32-d08fa205af0b\") " Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.517254 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.523804 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.528009 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h" (OuterVolumeSpecName: "kube-api-access-74z7h") pod "5e6600de-c059-4b00-bb32-d08fa205af0b" (UID: "5e6600de-c059-4b00-bb32-d08fa205af0b"). InnerVolumeSpecName "kube-api-access-74z7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618202 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e6600de-c059-4b00-bb32-d08fa205af0b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618240 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74z7h\" (UniqueName: \"kubernetes.io/projected/5e6600de-c059-4b00-bb32-d08fa205af0b-kube-api-access-74z7h\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:03 crc kubenswrapper[4837]: I0313 12:45:03.618249 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e6600de-c059-4b00-bb32-d08fa205af0b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056827 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" event={"ID":"5e6600de-c059-4b00-bb32-d08fa205af0b","Type":"ContainerDied","Data":"9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599"} Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056867 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9223edefc813554c07fd84bc832d210bc6d60ad4f95d18bbbb21c5caf7d7c599" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.056900 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556765-tjz7g" Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.515054 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:45:04 crc kubenswrapper[4837]: I0313 12:45:04.530031 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556720-slvpc"] Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.083776 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d18151-32fe-4457-814f-33c3ed53dab8" path="/var/lib/kubelet/pods/a6d18151-32fe-4457-814f-33c3ed53dab8/volumes" Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.484363 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:45:05 crc kubenswrapper[4837]: I0313 12:45:05.484432 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:45:11 crc kubenswrapper[4837]: I0313 12:45:11.962239 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/kube-rbac-proxy/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.117858 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/controller/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.158252 4837 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.348967 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.399098 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.402526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.426233 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.599351 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.607236 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.623742 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.658687 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.812766 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.846316 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.861464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:45:12 crc kubenswrapper[4837]: I0313 12:45:12.877546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/controller/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.051612 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr-metrics/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.077114 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.121121 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy-frr/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.263294 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/reloader/0.log" Mar 13 
12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.357026 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jwgl7_c72405c5-2c81-43f4-93c6-f73f9771be8b/frr-k8s-webhook-server/0.log" Mar 13 12:45:13 crc kubenswrapper[4837]: I0313 12:45:13.852338 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-dcfbdf95f-7x96d_41898fd8-d078-444c-bb55-33f4fb6f3dcc/manager/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.031893 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59b847b88-lrvzm_eabfad13-4fe4-495d-8b6a-2da56ef3b826/webhook-server/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.064935 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/kube-rbac-proxy/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.362713 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr/0.log" Mar 13 12:45:14 crc kubenswrapper[4837]: I0313 12:45:14.519462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/speaker/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.643094 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.815722 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.841685 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:26 crc kubenswrapper[4837]: I0313 12:45:26.889629 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.031854 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/extract/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.078882 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.095510 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.214937 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.410256 4837 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.415478 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.416597 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.603700 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.611234 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/extract/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.616788 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.778677 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.921675 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.943356 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:27 crc kubenswrapper[4837]: I0313 12:45:27.952820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.125943 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.132627 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.406863 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.546594 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.594217 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/registry-server/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.605373 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.643455 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.805250 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:45:28 crc kubenswrapper[4837]: I0313 12:45:28.846274 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.034311 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7rzpc_b87c8f86-a346-4907-9441-048c3220646f/marketplace-operator/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.152468 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.347357 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.358313 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/registry-server/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.396476 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.427425 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.566230 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.571197 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.698088 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/registry-server/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.766756 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.950144 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.951421 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:29 crc kubenswrapper[4837]: I0313 12:45:29.979868 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.119262 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.204952 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:45:30 crc kubenswrapper[4837]: I0313 12:45:30.704703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/registry-server/0.log" Mar 13 12:45:32 crc kubenswrapper[4837]: I0313 12:45:32.681278 4837 scope.go:117] "RemoveContainer" containerID="2d2bfd751903359f1fbdf915afe9614d288e33b823b0215d4cd3578202f69f1c" Mar 13 12:45:35 crc kubenswrapper[4837]: I0313 12:45:35.484120 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:45:35 crc kubenswrapper[4837]: I0313 12:45:35.484420 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.145813 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:00 crc kubenswrapper[4837]: E0313 12:46:00.146675 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.146686 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.146925 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6600de-c059-4b00-bb32-d08fa205af0b" containerName="collect-profiles" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.147710 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.150554 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.157380 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.190282 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.191106 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.303128 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.404817 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.426473 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"auto-csr-approver-29556766-lgkks\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.508000 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:00 crc kubenswrapper[4837]: I0313 12:46:00.966994 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:46:01 crc kubenswrapper[4837]: I0313 12:46:01.580678 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerStarted","Data":"15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0"} Mar 13 12:46:02 crc kubenswrapper[4837]: I0313 12:46:02.589711 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerDied","Data":"ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68"} Mar 13 12:46:02 crc kubenswrapper[4837]: I0313 12:46:02.589617 4837 generic.go:334] "Generic (PLEG): container finished" podID="528b7541-8fac-4df9-9168-c3166532618d" containerID="ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68" exitCode=0 Mar 13 12:46:03 crc kubenswrapper[4837]: I0313 12:46:03.983834 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.073581 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") pod \"528b7541-8fac-4df9-9168-c3166532618d\" (UID: \"528b7541-8fac-4df9-9168-c3166532618d\") " Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.080082 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56" (OuterVolumeSpecName: "kube-api-access-bbx56") pod "528b7541-8fac-4df9-9168-c3166532618d" (UID: "528b7541-8fac-4df9-9168-c3166532618d"). InnerVolumeSpecName "kube-api-access-bbx56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.179828 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbx56\" (UniqueName: \"kubernetes.io/projected/528b7541-8fac-4df9-9168-c3166532618d-kube-api-access-bbx56\") on node \"crc\" DevicePath \"\"" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.609809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556766-lgkks" event={"ID":"528b7541-8fac-4df9-9168-c3166532618d","Type":"ContainerDied","Data":"15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0"} Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.609850 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15fdf260725a5dcb59db67a57200a80bab5a12f32fd0ab8f574e699cb6938eb0" Mar 13 12:46:04 crc kubenswrapper[4837]: I0313 12:46:04.610534 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556766-lgkks" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.071339 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.073511 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556760-zcnxn"] Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.483580 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.483934 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484000 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484826 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.484902 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" gracePeriod=600 Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626057 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" exitCode=0 Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626116 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd"} Mar 13 12:46:05 crc kubenswrapper[4837]: I0313 12:46:05.626153 4837 scope.go:117] "RemoveContainer" containerID="0eaeaa7b861d63492055f8d488f2ab733a19588d375928b17dfe13cf022add65" Mar 13 12:46:06 crc kubenswrapper[4837]: I0313 12:46:06.635552 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} Mar 13 12:46:07 crc kubenswrapper[4837]: I0313 12:46:07.062848 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a273cb74-6dcc-4e87-8f25-db5c77132250" 
path="/var/lib/kubelet/pods/a273cb74-6dcc-4e87-8f25-db5c77132250/volumes" Mar 13 12:46:32 crc kubenswrapper[4837]: I0313 12:46:32.746757 4837 scope.go:117] "RemoveContainer" containerID="da2f9878f57615785241ef1796e14de81a297b17cfd5ebaf3f55711c66c5482b" Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.315730 4837 generic.go:334] "Generic (PLEG): container finished" podID="8822de14-eaa5-4016-91fd-611718d9b51a" containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" exitCode=0 Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.315948 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jkb99/must-gather-4lckb" event={"ID":"8822de14-eaa5-4016-91fd-611718d9b51a","Type":"ContainerDied","Data":"46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed"} Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.316899 4837 scope.go:117] "RemoveContainer" containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" Mar 13 12:47:18 crc kubenswrapper[4837]: I0313 12:47:18.912735 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/gather/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.259454 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.260330 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jkb99/must-gather-4lckb" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" containerID="cri-o://d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" gracePeriod=2 Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.269480 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jkb99/must-gather-4lckb"] Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.402910 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.403207 4837 generic.go:334] "Generic (PLEG): container finished" podID="8822de14-eaa5-4016-91fd-611718d9b51a" containerID="d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" exitCode=143 Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.770703 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.771320 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.801345 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") pod \"8822de14-eaa5-4016-91fd-611718d9b51a\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.801406 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") pod \"8822de14-eaa5-4016-91fd-611718d9b51a\" (UID: \"8822de14-eaa5-4016-91fd-611718d9b51a\") " Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.807519 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt" (OuterVolumeSpecName: "kube-api-access-zbzbt") pod "8822de14-eaa5-4016-91fd-611718d9b51a" (UID: "8822de14-eaa5-4016-91fd-611718d9b51a"). InnerVolumeSpecName "kube-api-access-zbzbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.904599 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbzbt\" (UniqueName: \"kubernetes.io/projected/8822de14-eaa5-4016-91fd-611718d9b51a-kube-api-access-zbzbt\") on node \"crc\" DevicePath \"\"" Mar 13 12:47:27 crc kubenswrapper[4837]: I0313 12:47:27.963904 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8822de14-eaa5-4016-91fd-611718d9b51a" (UID: "8822de14-eaa5-4016-91fd-611718d9b51a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.006855 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8822de14-eaa5-4016-91fd-611718d9b51a-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.414027 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jkb99_must-gather-4lckb_8822de14-eaa5-4016-91fd-611718d9b51a/copy/0.log" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.416334 4837 scope.go:117] "RemoveContainer" containerID="d7374ab200a788a99f53fe2448f4035d1be2d4984c27b0031e0578210408765b" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.416439 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jkb99/must-gather-4lckb" Mar 13 12:47:28 crc kubenswrapper[4837]: I0313 12:47:28.454900 4837 scope.go:117] "RemoveContainer" containerID="46bee07e0cc64861f34813174541cff76485e4b8cd9b5fb84ab93fd9eff59fed" Mar 13 12:47:29 crc kubenswrapper[4837]: I0313 12:47:29.064847 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" path="/var/lib/kubelet/pods/8822de14-eaa5-4016-91fd-611718d9b51a/volumes" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.147733 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148677 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148693 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148724 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148732 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: E0313 12:48:00.148748 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148756 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.148995 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="copy" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149012 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="528b7541-8fac-4df9-9168-c3166532618d" containerName="oc" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149026 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="8822de14-eaa5-4016-91fd-611718d9b51a" containerName="gather" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.149796 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.151975 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.154910 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.155302 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.186825 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.208953 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.311924 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.331991 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"auto-csr-approver-29556768-f66sk\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.495583 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.922608 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:48:00 crc kubenswrapper[4837]: W0313 12:48:00.931208 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b297ac1_71ba_4b15_b915_a38f9da4ebb7.slice/crio-be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1 WatchSource:0}: Error finding container be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1: Status 404 returned error can't find the container with id be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1 Mar 13 12:48:00 crc kubenswrapper[4837]: I0313 12:48:00.933894 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:48:01 crc kubenswrapper[4837]: I0313 12:48:01.736332 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerStarted","Data":"be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1"} Mar 13 12:48:02 crc kubenswrapper[4837]: I0313 12:48:02.749875 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerID="6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b" exitCode=0 Mar 13 12:48:02 crc kubenswrapper[4837]: I0313 12:48:02.750171 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerDied","Data":"6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b"} Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.104372 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.277414 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") pod \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\" (UID: \"1b297ac1-71ba-4b15-b915-a38f9da4ebb7\") " Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.284401 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc" (OuterVolumeSpecName: "kube-api-access-s7wzc") pod "1b297ac1-71ba-4b15-b915-a38f9da4ebb7" (UID: "1b297ac1-71ba-4b15-b915-a38f9da4ebb7"). InnerVolumeSpecName "kube-api-access-s7wzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.379962 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wzc\" (UniqueName: \"kubernetes.io/projected/1b297ac1-71ba-4b15-b915-a38f9da4ebb7-kube-api-access-s7wzc\") on node \"crc\" DevicePath \"\"" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778488 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556768-f66sk" event={"ID":"1b297ac1-71ba-4b15-b915-a38f9da4ebb7","Type":"ContainerDied","Data":"be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1"} Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778538 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be55f861f9827b65e56a18da973cb785ee7a3d9371ba71c4db0877383b2afeb1" Mar 13 12:48:04 crc kubenswrapper[4837]: I0313 12:48:04.778552 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556768-f66sk" Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.167977 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.176958 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556762-g52qb"] Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.483579 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:48:05 crc kubenswrapper[4837]: I0313 12:48:05.484293 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:48:07 crc kubenswrapper[4837]: I0313 12:48:07.062014 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4" path="/var/lib/kubelet/pods/e5bfa2ad-f9e7-42e8-b9ea-cf4a1a5c6ca4/volumes" Mar 13 12:48:32 crc kubenswrapper[4837]: I0313 12:48:32.866404 4837 scope.go:117] "RemoveContainer" containerID="d6ca53672f75fdcf8f31c32bb76f3e903dae1282d3f22ff4ff5cc9e6da3282e1" Mar 13 12:48:35 crc kubenswrapper[4837]: I0313 12:48:35.483902 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:48:35 crc kubenswrapper[4837]: I0313 12:48:35.484525 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.483801 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484325 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484364 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.484949 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 12:49:05 crc kubenswrapper[4837]: I0313 12:49:05.485001 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" gracePeriod=600 Mar 13 12:49:05 crc kubenswrapper[4837]: E0313 12:49:05.605561 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.326329 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" exitCode=0 Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.326377 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0"} Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.326462 4837 scope.go:117] "RemoveContainer" containerID="42895e7497f11e52c6189b0227f4673591fee559ac68adfdd28355562f8112bd" Mar 13 12:49:06 crc kubenswrapper[4837]: I0313 12:49:06.327135 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:06 crc kubenswrapper[4837]: E0313 12:49:06.327436 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" 
Mar 13 12:49:18 crc kubenswrapper[4837]: I0313 12:49:18.048909 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:18 crc kubenswrapper[4837]: E0313 12:49:18.049894 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:32 crc kubenswrapper[4837]: I0313 12:49:32.938863 4837 scope.go:117] "RemoveContainer" containerID="021f2f7590a98a1912559c67d885639fef8ea6affc1fcb856c58211036ebcb42" Mar 13 12:49:33 crc kubenswrapper[4837]: I0313 12:49:33.048664 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:33 crc kubenswrapper[4837]: E0313 12:49:33.049358 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:44 crc kubenswrapper[4837]: I0313 12:49:44.048336 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:44 crc kubenswrapper[4837]: E0313 12:49:44.049136 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:49:59 crc kubenswrapper[4837]: I0313 12:49:59.048230 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:49:59 crc kubenswrapper[4837]: E0313 12:49:59.049052 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.153260 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:00 crc kubenswrapper[4837]: E0313 12:50:00.153782 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.153798 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.154051 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" containerName="oc" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.154806 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.156969 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.157066 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.157330 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.164622 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.231982 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.334232 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.358477 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"auto-csr-approver-29556770-5vjcn\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.479424 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:00 crc kubenswrapper[4837]: I0313 12:50:00.937937 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:50:01 crc kubenswrapper[4837]: I0313 12:50:01.856917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerStarted","Data":"f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d"} Mar 13 12:50:02 crc kubenswrapper[4837]: I0313 12:50:02.868259 4837 generic.go:334] "Generic (PLEG): container finished" podID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerID="8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44" exitCode=0 Mar 13 12:50:02 crc kubenswrapper[4837]: I0313 12:50:02.868350 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerDied","Data":"8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44"} Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.189115 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.310775 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") pod \"39e3042e-9415-4734-bfa5-8def0b858b6e\" (UID: \"39e3042e-9415-4734-bfa5-8def0b858b6e\") " Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.324900 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw" (OuterVolumeSpecName: "kube-api-access-xf7lw") pod "39e3042e-9415-4734-bfa5-8def0b858b6e" (UID: "39e3042e-9415-4734-bfa5-8def0b858b6e"). InnerVolumeSpecName "kube-api-access-xf7lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.413748 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7lw\" (UniqueName: \"kubernetes.io/projected/39e3042e-9415-4734-bfa5-8def0b858b6e-kube-api-access-xf7lw\") on node \"crc\" DevicePath \"\"" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888136 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" event={"ID":"39e3042e-9415-4734-bfa5-8def0b858b6e","Type":"ContainerDied","Data":"f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d"} Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888183 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54e70c22239b0c1bbae84029edcf748edb7110b2fb1a844699a97a3a7ed2e7d" Mar 13 12:50:04 crc kubenswrapper[4837]: I0313 12:50:04.888228 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556770-5vjcn" Mar 13 12:50:05 crc kubenswrapper[4837]: I0313 12:50:05.273182 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:50:05 crc kubenswrapper[4837]: I0313 12:50:05.285746 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556764-r7n49"] Mar 13 12:50:07 crc kubenswrapper[4837]: I0313 12:50:07.059947 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e88464a-8619-4750-ac96-b1ad569fcece" path="/var/lib/kubelet/pods/7e88464a-8619-4750-ac96-b1ad569fcece/volumes" Mar 13 12:50:10 crc kubenswrapper[4837]: I0313 12:50:10.048176 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:10 crc kubenswrapper[4837]: E0313 12:50:10.049941 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:23 crc kubenswrapper[4837]: I0313 12:50:23.049139 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:23 crc kubenswrapper[4837]: E0313 12:50:23.049951 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.126860 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"] Mar 13 12:50:30 crc kubenswrapper[4837]: E0313 12:50:30.127462 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.127477 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.127696 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" containerName="oc" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.128725 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.133170 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lkqv5"/"kube-root-ca.crt" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.133802 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lkqv5"/"openshift-service-ca.crt" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.148993 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"] Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.224332 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.224400 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326075 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326116 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.326599 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.351450 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"must-gather-vz7zz\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.494771 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:50:30 crc kubenswrapper[4837]: I0313 12:50:30.741899 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"] Mar 13 12:50:31 crc kubenswrapper[4837]: I0313 12:50:31.180883 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75"} Mar 13 12:50:31 crc kubenswrapper[4837]: I0313 12:50:31.180937 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"67e6ef7e4be8b057b18838efc07c184fe5840d59f0bbbe9911d4187ef608fd8c"} Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.216809 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerStarted","Data":"bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427"} Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.248321 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" podStartSLOduration=2.248297288 podStartE2EDuration="2.248297288s" podCreationTimestamp="2026-03-13 12:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:50:32.23555377 +0000 UTC m=+3747.873820533" watchObservedRunningTime="2026-03-13 12:50:32.248297288 +0000 UTC m=+3747.886564061" Mar 13 12:50:32 crc kubenswrapper[4837]: I0313 12:50:32.983883 4837 scope.go:117] "RemoveContainer" containerID="1dc84242c71f8e5d31bcd05b0ae44aeb29c8a625295bbab7f2eb79c610ba55a4" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.549522 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"] Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.551014 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.555451 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lkqv5"/"default-dockercfg-p2glw" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.707651 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.707894 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810094 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810221 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.810310 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.828236 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"crc-debug-djqf7\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: I0313 12:50:34.868476 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:50:34 crc kubenswrapper[4837]: W0313 12:50:34.917391 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70913aeb_b1cd_4a84_b043_8b10d8a28196.slice/crio-a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097 WatchSource:0}: Error finding container a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097: Status 404 returned error can't find the container with id a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097 Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.053421 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:35 crc kubenswrapper[4837]: E0313 12:50:35.053906 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.242150 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerStarted","Data":"bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e"} Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.242823 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerStarted","Data":"a84cdf79e4cee6ea4dcee02331b86c1229aab71ff3081c6570c2ff689e35e097"} Mar 13 12:50:35 crc kubenswrapper[4837]: I0313 12:50:35.257793 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" podStartSLOduration=1.257773597 podStartE2EDuration="1.257773597s" podCreationTimestamp="2026-03-13 12:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 12:50:35.254939378 +0000 UTC m=+3750.893206141" watchObservedRunningTime="2026-03-13 12:50:35.257773597 +0000 UTC m=+3750.896040360" Mar 13 12:50:49 crc kubenswrapper[4837]: I0313 12:50:49.048678 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:50:49 crc kubenswrapper[4837]: E0313 12:50:49.049480 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:02 crc kubenswrapper[4837]: I0313 12:51:02.048206 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:02 crc kubenswrapper[4837]: E0313 12:51:02.049133 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:09 crc kubenswrapper[4837]: I0313 12:51:09.528795 4837 generic.go:334] "Generic (PLEG): container finished" podID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerID="bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e" exitCode=0 Mar 13 12:51:09 crc kubenswrapper[4837]: I0313 12:51:09.528879 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" event={"ID":"70913aeb-b1cd-4a84-b043-8b10d8a28196","Type":"ContainerDied","Data":"bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e"} Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.672838 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.704535 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"] Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.716188 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-djqf7"] Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.796588 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") pod \"70913aeb-b1cd-4a84-b043-8b10d8a28196\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.796784 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") pod \"70913aeb-b1cd-4a84-b043-8b10d8a28196\" (UID: \"70913aeb-b1cd-4a84-b043-8b10d8a28196\") " Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.797091 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host" (OuterVolumeSpecName: "host") pod "70913aeb-b1cd-4a84-b043-8b10d8a28196" (UID: "70913aeb-b1cd-4a84-b043-8b10d8a28196"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.797483 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70913aeb-b1cd-4a84-b043-8b10d8a28196-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.805902 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk" (OuterVolumeSpecName: "kube-api-access-qb9wk") pod "70913aeb-b1cd-4a84-b043-8b10d8a28196" (UID: "70913aeb-b1cd-4a84-b043-8b10d8a28196"). InnerVolumeSpecName "kube-api-access-qb9wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:51:10 crc kubenswrapper[4837]: I0313 12:51:10.899382 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb9wk\" (UniqueName: \"kubernetes.io/projected/70913aeb-b1cd-4a84-b043-8b10d8a28196-kube-api-access-qb9wk\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.067625 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" path="/var/lib/kubelet/pods/70913aeb-b1cd-4a84-b043-8b10d8a28196/volumes" Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.548249 4837 scope.go:117] "RemoveContainer" containerID="bd2cc56ba5a1a7ecd3acb1a078af5a3a6894476f89e6f14c15c62f2a11f1660e" Mar 13 12:51:11 crc kubenswrapper[4837]: I0313 12:51:11.548288 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-djqf7" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.082264 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"] Mar 13 12:51:12 crc kubenswrapper[4837]: E0313 12:51:12.084066 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.084174 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.084436 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="70913aeb-b1cd-4a84-b043-8b10d8a28196" containerName="container-00" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.085189 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.088014 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lkqv5"/"default-dockercfg-p2glw" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.221562 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.221765 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.323686 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.323813 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.324196 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.348862 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"crc-debug-4ctbv\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.406343 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:12 crc kubenswrapper[4837]: I0313 12:51:12.562955 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" event={"ID":"0c98ae53-ab32-4810-8dc5-6989adb356d5","Type":"ContainerStarted","Data":"60a594404f1ed7f723a9298dbd0a5df295668ef5d66ffda156410c502f7d3cf2"} Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.795771 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:13 crc kubenswrapper[4837]: E0313 12:51:13.796450 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.805431 4837 generic.go:334] "Generic (PLEG): container finished" podID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerID="37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8" exitCode=0 Mar 13 12:51:13 crc kubenswrapper[4837]: I0313 12:51:13.805487 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" event={"ID":"0c98ae53-ab32-4810-8dc5-6989adb356d5","Type":"ContainerDied","Data":"37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8"} Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.242539 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"] Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.273331 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-4ctbv"] Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.915439 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.975521 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") pod \"0c98ae53-ab32-4810-8dc5-6989adb356d5\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.975664 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") pod \"0c98ae53-ab32-4810-8dc5-6989adb356d5\" (UID: \"0c98ae53-ab32-4810-8dc5-6989adb356d5\") " Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.976226 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host" (OuterVolumeSpecName: "host") pod "0c98ae53-ab32-4810-8dc5-6989adb356d5" (UID: "0c98ae53-ab32-4810-8dc5-6989adb356d5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:51:14 crc kubenswrapper[4837]: I0313 12:51:14.981901 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c" (OuterVolumeSpecName: "kube-api-access-kkw9c") pod "0c98ae53-ab32-4810-8dc5-6989adb356d5" (UID: "0c98ae53-ab32-4810-8dc5-6989adb356d5"). InnerVolumeSpecName "kube-api-access-kkw9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.067393 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" path="/var/lib/kubelet/pods/0c98ae53-ab32-4810-8dc5-6989adb356d5/volumes" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.078213 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkw9c\" (UniqueName: \"kubernetes.io/projected/0c98ae53-ab32-4810-8dc5-6989adb356d5-kube-api-access-kkw9c\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.078515 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c98ae53-ab32-4810-8dc5-6989adb356d5-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504055 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"] Mar 13 12:51:15 crc kubenswrapper[4837]: E0313 12:51:15.504554 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504577 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.504921 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c98ae53-ab32-4810-8dc5-6989adb356d5" containerName="container-00" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.505696 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.588871 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.589019 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.690938 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.691033 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.691137 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.708387 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"crc-debug-vdrpl\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.824447 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.825858 4837 scope.go:117] "RemoveContainer" containerID="37226ff0b2678a7a2f65c8a485d9c9dab1a4017df2a7227ff6688c23ec1e7cd8" Mar 13 12:51:15 crc kubenswrapper[4837]: I0313 12:51:15.825897 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-4ctbv" Mar 13 12:51:15 crc kubenswrapper[4837]: W0313 12:51:15.871721 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd11db9b4_2c6a_422b_9de4_ba64c73d8db8.slice/crio-8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206 WatchSource:0}: Error finding container 8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206: Status 404 returned error can't find the container with id 8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206 Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835135 4837 generic.go:334] "Generic (PLEG): container finished" podID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerID="54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca" exitCode=0 Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835222 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" event={"ID":"d11db9b4-2c6a-422b-9de4-ba64c73d8db8","Type":"ContainerDied","Data":"54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca"} Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.835461 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" event={"ID":"d11db9b4-2c6a-422b-9de4-ba64c73d8db8","Type":"ContainerStarted","Data":"8c5b501aee306ba1ef1efa0cce4a88cbfbd6bbed51084cd64ab93d5534477206"} Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.867889 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"] Mar 13 12:51:16 crc kubenswrapper[4837]: I0313 12:51:16.876304 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/crc-debug-vdrpl"] Mar 13 12:51:17 crc kubenswrapper[4837]: I0313 12:51:17.986716 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.033921 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") pod \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034075 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") pod \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\" (UID: \"d11db9b4-2c6a-422b-9de4-ba64c73d8db8\") " Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034206 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host" (OuterVolumeSpecName: "host") pod "d11db9b4-2c6a-422b-9de4-ba64c73d8db8" (UID: "d11db9b4-2c6a-422b-9de4-ba64c73d8db8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.034623 4837 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-host\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.040167 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7" (OuterVolumeSpecName: "kube-api-access-4t5h7") pod "d11db9b4-2c6a-422b-9de4-ba64c73d8db8" (UID: "d11db9b4-2c6a-422b-9de4-ba64c73d8db8"). InnerVolumeSpecName "kube-api-access-4t5h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.136864 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5h7\" (UniqueName: \"kubernetes.io/projected/d11db9b4-2c6a-422b-9de4-ba64c73d8db8-kube-api-access-4t5h7\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.855976 4837 scope.go:117] "RemoveContainer" containerID="54fb761872f281d7c0b689685e33a78b40ef7b8a8fa21695c3ff8545600aa7ca" Mar 13 12:51:18 crc kubenswrapper[4837]: I0313 12:51:18.856019 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lkqv5/crc-debug-vdrpl" Mar 13 12:51:19 crc kubenswrapper[4837]: I0313 12:51:19.058228 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" path="/var/lib/kubelet/pods/d11db9b4-2c6a-422b-9de4-ba64c73d8db8/volumes" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.485935 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:22 crc kubenswrapper[4837]: E0313 12:51:22.486360 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.486372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.486573 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11db9b4-2c6a-422b-9de4-ba64c73d8db8" containerName="container-00" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.489896 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.510021 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627127 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627381 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.627434 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.729806 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.729864 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730000 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730326 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.730469 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.767722 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"certified-operators-jn4zc\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:22 crc kubenswrapper[4837]: I0313 12:51:22.817609 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:23 crc kubenswrapper[4837]: I0313 12:51:23.546498 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109236 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" exitCode=0 Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993"} Mar 13 12:51:24 crc kubenswrapper[4837]: I0313 12:51:24.109325 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"da8784cb3a7c82be8f8439b52b3069bcb72b04da847b72613804c85ea2bb6618"} Mar 13 12:51:25 crc kubenswrapper[4837]: I0313 12:51:25.125405 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"} Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.049805 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:27 crc kubenswrapper[4837]: E0313 12:51:27.051378 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.152243 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" exitCode=0 Mar 13 12:51:27 crc kubenswrapper[4837]: I0313 12:51:27.152314 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"} Mar 13 12:51:28 crc kubenswrapper[4837]: I0313 12:51:28.162890 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerStarted","Data":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} Mar 13 12:51:28 crc kubenswrapper[4837]: I0313 12:51:28.190509 4837 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jn4zc" podStartSLOduration=2.756689093 podStartE2EDuration="6.190487637s" podCreationTimestamp="2026-03-13 12:51:22 +0000 UTC" firstStartedPulling="2026-03-13 12:51:24.11215402 +0000 UTC m=+3799.750420783" lastFinishedPulling="2026-03-13 12:51:27.545952564 +0000 UTC m=+3803.184219327" observedRunningTime="2026-03-13 12:51:28.181349612 +0000 UTC m=+3803.819616375" watchObservedRunningTime="2026-03-13 12:51:28.190487637 +0000 UTC m=+3803.828754400" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 12:51:32.818990 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 12:51:32.819658 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:32 crc kubenswrapper[4837]: I0313 12:51:32.882274 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:33 crc kubenswrapper[4837]: I0313 12:51:33.244294 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:33 crc kubenswrapper[4837]: I0313 12:51:33.292574 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.216774 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jn4zc" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" containerID="cri-o://75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" gracePeriod=2 Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.685007 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.824816 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.824930 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.825066 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") pod \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\" (UID: \"e35b70ca-0247-45d0-aee2-1fb91eefa45c\") " Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.825822 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities" (OuterVolumeSpecName: "utilities") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.833500 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh" (OuterVolumeSpecName: "kube-api-access-9n5vh") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "kube-api-access-9n5vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.880947 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e35b70ca-0247-45d0-aee2-1fb91eefa45c" (UID: "e35b70ca-0247-45d0-aee2-1fb91eefa45c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927481 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927825 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e35b70ca-0247-45d0-aee2-1fb91eefa45c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:35 crc kubenswrapper[4837]: I0313 12:51:35.927840 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n5vh\" (UniqueName: \"kubernetes.io/projected/e35b70ca-0247-45d0-aee2-1fb91eefa45c-kube-api-access-9n5vh\") on node \"crc\" DevicePath \"\"" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228522 4837 generic.go:334] "Generic (PLEG): container finished" podID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" exitCode=0 Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228563 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228587 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jn4zc" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228604 4837 scope.go:117] "RemoveContainer" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.228593 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn4zc" event={"ID":"e35b70ca-0247-45d0-aee2-1fb91eefa45c","Type":"ContainerDied","Data":"da8784cb3a7c82be8f8439b52b3069bcb72b04da847b72613804c85ea2bb6618"} Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.250014 4837 scope.go:117] "RemoveContainer" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.281698 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.288698 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jn4zc"] Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.294868 4837 scope.go:117] "RemoveContainer" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.318748 4837 scope.go:117] "RemoveContainer" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 12:51:36.319315 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": container with ID starting with 75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73 not found: ID does not exist" containerID="75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.319372 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73"} err="failed to get container status \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": rpc error: code = NotFound desc = could not find container \"75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73\": container with ID starting with 75019d955c224e1828886455a7893c37749e325e23de7f29d47424fb80702f73 not found: ID does not exist" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.319414 4837 scope.go:117] "RemoveContainer" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 12:51:36.320533 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": container with ID starting with 018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54 not found: ID does not exist" containerID="018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.320564 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54"} err="failed to get container status \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": rpc error: code = NotFound desc = could not find 
container \"018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54\": container with ID starting with 018ea4b7b5731d21bb3788eb633c0a2608f7d7437cbf7d79b0b4ecfffaa5bb54 not found: ID does not exist" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.320586 4837 scope.go:117] "RemoveContainer" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: E0313 12:51:36.321929 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": container with ID starting with 4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993 not found: ID does not exist" containerID="4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993" Mar 13 12:51:36 crc kubenswrapper[4837]: I0313 12:51:36.321954 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993"} err="failed to get container status \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": rpc error: code = NotFound desc = could not find container \"4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993\": container with ID starting with 4cfa78b841c4bfd68fc339c7da7ba7a470ed2e7b9319732dc821f3f2f5974993 not found: ID does not exist" Mar 13 12:51:37 crc kubenswrapper[4837]: I0313 12:51:37.062536 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" path="/var/lib/kubelet/pods/e35b70ca-0247-45d0-aee2-1fb91eefa45c/volumes" Mar 13 12:51:41 crc kubenswrapper[4837]: I0313 12:51:41.051616 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:41 crc kubenswrapper[4837]: E0313 12:51:41.052166 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.172068 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.298851 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6d84f6b8c8-8rrwq_74c7e377-b579-47bc-a992-cca0cf047627/barbican-api-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.373380 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.400264 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58c489697d-dgjtz_d1cfe08e-23bd-4f52-ab3c-3d68377de2a9/barbican-keystone-listener-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.565214 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker/0.log" 
Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.652249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f4ff9ff9-mjmsz_55084c82-a823-4f31-926e-21702ba02ba1/barbican-worker-log/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.785441 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-g6tlj_2980c3c3-0093-4e8f-a9fc-ce42ef57c9f6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.863502 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-central-agent/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.921570 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/ceilometer-notification-agent/0.log" Mar 13 12:51:46 crc kubenswrapper[4837]: I0313 12:51:46.976840 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/proxy-httpd/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.047723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_82b5b509-a674-4a89-a7cc-c01c7bfca144/sg-core/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.223064 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api-log/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.228126 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a8004928-50bc-4db8-a701-4458c42bc776/cinder-api/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.392737 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/cinder-scheduler/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.440353 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_580b8861-16eb-4142-bd61-6d0221a07f4d/probe/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.503375 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-s95mk_875e3c3d-ae20-4ad7-aaeb-87b13b5fa6f4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.618566 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bsbxp_0e7fe83f-ec1b-4f03-8ed5-c07adb5b2de5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.726579 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.854766 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/init/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 12:51:47.912096 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xw8ts_121f6d1b-1277-4d68-8a48-6c4630dd6fe5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:47 crc kubenswrapper[4837]: I0313 
12:51:47.978928 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-bxc2t_98f4bdc5-6452-4630-a299-6234d8a63bf8/dnsmasq-dns/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.109855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-log/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.146855 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3f87d89-35d5-4dc0-9c37-5297718a9351/glance-httpd/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.263328 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-httpd/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.337054 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d0f3b003-127f-414f-877a-8f7df2872049/glance-log/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.490825 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.668814 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gj59c_6cc8d0dd-d1e6-4374-bb90-aaefc9197350/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.759136 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-2b48q_033a02c2-cbe4-4676-ae46-f9b9b17a60fb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:48 crc kubenswrapper[4837]: I0313 12:51:48.822227 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fd6ddfd9b-f66l8_4d3df345-07a2-41bf-aae4-088b3ce83b63/horizon-log/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.089075 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_abd69ff2-e72e-40c0-925f-d0c1c0a40f9a/kube-state-metrics/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.117843 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55dc4d44f8-mvjvg_9cb9614d-a433-4be3-8145-4c1c8593404f/keystone-api/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.243608 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wgnt5_394104d4-0291-4071-a7da-d7b71e0f4083/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.586754 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-api/0.log" Mar 13 12:51:49 crc kubenswrapper[4837]: I0313 12:51:49.931147 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-48kg4_20f35066-9c10-4433-a655-f5cef18d4deb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.014881 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-667d547b9-4p8qm_3c00dfc0-061b-43ba-b529-a89c9157a0cf/neutron-httpd/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: 
I0313 12:51:50.591936 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-log/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.722045 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_58240a84-c8ab-43a9-8113-eaf2d0ddea2e/nova-cell0-conductor-conductor/0.log" Mar 13 12:51:50 crc kubenswrapper[4837]: I0313 12:51:50.954062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9a51debb-c1cb-4a55-b845-e89d89d11e86/nova-cell1-conductor-conductor/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.028993 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4e6cd1d9-f670-4e94-8322-44e471c3be71/nova-api-api/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.030945 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_662e258d-fe94-4373-912d-c906f1e93c90/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.233005 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-4jdmk_e6986f16-e143-49f4-81e5-58abba717876/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:51 crc kubenswrapper[4837]: I0313 12:51:51.813078 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-log/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.048664 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.257937 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d380e047-7297-4835-b948-6c86c6b6aa27/nova-scheduler-scheduler/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.262410 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.322434 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_362e31d4-ea62-40ed-8426-982d47559472/galera/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.462277 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.715227 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/mysql-bootstrap/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.742371 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adb9ab64-aa4b-45f4-8738-0ed74c3ed2bd/galera/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.878140 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5d15c820-a2ee-4d4c-986f-2c2f09b43f79/openstackclient/0.log" Mar 13 12:51:52 crc kubenswrapper[4837]: I0313 12:51:52.946265 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w69p6_18eb496a-7d9f-4bf6-af71-3b7b585d0f7d/openstack-network-exporter/0.log" Mar 13 12:51:53 crc 
kubenswrapper[4837]: I0313 12:51:53.071048 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faa5418-aa48-4e20-830c-bb171cfea0d9/nova-metadata-metadata/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.191871 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nbhpw_32dc51d9-5638-4530-91c8-5be8c13e60f3/ovn-controller/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.335694 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.465058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.475464 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovsdb-server-init/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.520120 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ls998_71e00962-6b2f-495c-8f34-52993f66cef9/ovs-vswitchd/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.703849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kbffp_092bd277-504a-450d-aca1-d8ecc18f0c9f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.766443 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/openstack-network-exporter/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.873849 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25ea0f5e-e277-4944-8c9d-2c7709e1a8cf/ovn-northd/0.log" Mar 13 12:51:53 crc kubenswrapper[4837]: I0313 12:51:53.951917 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/openstack-network-exporter/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.051914 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_38d61ffe-3c44-4657-bc91-d849f766a3e1/ovsdbserver-nb/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.189411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/openstack-network-exporter/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.206533 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3d10fcb0-4d45-45bf-a663-971b8ce74010/ovsdbserver-sb/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.456118 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-api/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.474925 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59f7b5dc8d-rnsz6_07eece9e-0e59-4a06-8fea-efb4217d6907/placement-log/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.516369 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:51:54 
crc kubenswrapper[4837]: I0313 12:51:54.735267 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/setup-container/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.843009 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90028d66-5134-4c09-af15-71e754f49bf3/rabbitmq/0.log" Mar 13 12:51:54 crc kubenswrapper[4837]: I0313 12:51:54.851885 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.015246 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/setup-container/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.051398 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_245e5a26-d143-4e4d-bae8-094275a91574/rabbitmq/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.062569 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:51:55 crc kubenswrapper[4837]: E0313 12:51:55.062950 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.100232 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9pv4d_3b96ea7e-2148-4659-9a26-3335c88888c1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.310305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pz9nt_0b7402b1-0b76-4ffa-b37f-6e014183f6a6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.342305 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-dxwq6_bfedd3e5-e8d7-4311-9a0d-30276ce40418/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.608563 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s6jdp_f12ac62a-2011-4e89-a16f-e136959f9d1a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.611840 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vjnpx_4ddcb794-ab03-4308-a93c-c5929ed96e01/ssh-known-hosts-edpm-deployment/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.852619 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-server/0.log" Mar 13 12:51:55 crc kubenswrapper[4837]: I0313 12:51:55.952953 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bfbc874dc-vsh7q_36ffa543-526d-4d56-b599-06fcfe0988cf/proxy-httpd/0.log" Mar 13 12:51:56 crc 
kubenswrapper[4837]: I0313 12:51:56.055602 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-69xgx_24998567-afa6-4adc-a503-4fc054946aef/swift-ring-rebalance/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.329153 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.433249 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-reaper/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.463865 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-replicator/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.478912 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/account-server/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.552860 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.672760 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-replicator/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.735422 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-server/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.754499 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/container-updater/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.819607 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-auditor/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.869515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-expirer/0.log" Mar 13 12:51:56 crc kubenswrapper[4837]: I0313 12:51:56.974620 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-replicator/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:56.998241 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-server/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.064304 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/object-updater/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.129817 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/rsync/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.189083 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_59565710-b9bc-46e6-ad92-7f12376de17c/swift-recon-cron/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.351050 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-c8l8x_ac15848f-4f6f-4159-828f-d30a77f93a4b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.461124 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66bdda91-c5b6-4879-9adf-21846884c797/tempest-tests-tempest-tests-runner/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.538024 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0244acef-b630-4b97-9bb5-9f99de391613/test-operator-logs-container/0.log" Mar 13 12:51:57 crc kubenswrapper[4837]: I0313 12:51:57.709723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42br8_e3ec33da-9091-4eb1-aafa-62b9bdf16072/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.160498 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:00 crc kubenswrapper[4837]: E0313 12:52:00.161148 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161161 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-content" Mar 13 12:52:00 crc kubenswrapper[4837]: E0313 12:52:00.161185 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161191 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="extract-utilities" Mar 13 12:52:00 crc kubenswrapper[4837]: E0313 12:52:00.161201 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161207 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.161407 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35b70ca-0247-45d0-aee2-1fb91eefa45c" containerName="registry-server" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.162035 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165740 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165818 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.165984 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.170674 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.248136 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.349594 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.382340 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"auto-csr-approver-29556772-5k6fp\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:00 crc kubenswrapper[4837]: I0313 12:52:00.490035 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:01 crc kubenswrapper[4837]: I0313 12:52:01.013464 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:52:01 crc kubenswrapper[4837]: I0313 12:52:01.457155 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerStarted","Data":"b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f"} Mar 13 12:52:02 crc kubenswrapper[4837]: I0313 12:52:02.466572 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerStarted","Data":"202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81"} Mar 13 12:52:02 crc kubenswrapper[4837]: I0313 12:52:02.486186 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" podStartSLOduration=1.391804637 podStartE2EDuration="2.486169278s" podCreationTimestamp="2026-03-13 12:52:00 +0000 UTC" firstStartedPulling="2026-03-13 12:52:01.001591513 +0000 UTC m=+3836.639858296" lastFinishedPulling="2026-03-13 12:52:02.095956174 +0000 UTC m=+3837.734222937" observedRunningTime="2026-03-13 12:52:02.482572665 +0000 UTC m=+3838.120839438" watchObservedRunningTime="2026-03-13 12:52:02.486169278 +0000 UTC m=+3838.124436041" Mar 13 12:52:03 crc kubenswrapper[4837]: I0313 12:52:03.480188 4837 generic.go:334] "Generic (PLEG): container finished" podID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerID="202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81" exitCode=0 Mar 13 12:52:03 crc kubenswrapper[4837]: I0313 12:52:03.480259 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerDied","Data":"202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81"} Mar 13 12:52:04 crc kubenswrapper[4837]: I0313 12:52:04.867933 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.035176 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") pod \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\" (UID: \"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0\") " Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.049934 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv" (OuterVolumeSpecName: "kube-api-access-6vcsv") pod "1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" (UID: "1ba6a258-c015-4c82-b7d0-736ea4ddf3a0"). InnerVolumeSpecName "kube-api-access-6vcsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.137702 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vcsv\" (UniqueName: \"kubernetes.io/projected/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0-kube-api-access-6vcsv\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.500949 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" event={"ID":"1ba6a258-c015-4c82-b7d0-736ea4ddf3a0","Type":"ContainerDied","Data":"b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f"} Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.501325 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b708ecc8449a33bcb49fe655201d03102cefaaa202afe9f71e72e16452298a0f" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.501407 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556772-5k6fp" Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.572262 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:52:05 crc kubenswrapper[4837]: I0313 12:52:05.581894 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556766-lgkks"] Mar 13 12:52:07 crc kubenswrapper[4837]: I0313 12:52:07.084103 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528b7541-8fac-4df9-9168-c3166532618d" path="/var/lib/kubelet/pods/528b7541-8fac-4df9-9168-c3166532618d/volumes" Mar 13 12:52:08 crc kubenswrapper[4837]: E0313 12:52:08.550165 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:08 crc kubenswrapper[4837]: I0313 12:52:08.650723 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ae39431b-5fa4-4a09-b76f-44b4d256c129/memcached/0.log" Mar 13 12:52:09 crc kubenswrapper[4837]: I0313 12:52:09.048650 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:09 crc kubenswrapper[4837]: E0313 12:52:09.048949 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.924336 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:14 crc kubenswrapper[4837]: E0313 12:52:14.926264 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.926407 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.926688 4837 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" containerName="oc" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.928261 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:14 crc kubenswrapper[4837]: I0313 12:52:14.939421 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030246 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030312 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.030627 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.132926 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133231 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133391 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133945 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.133996 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " 
pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.154928 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"redhat-operators-dklw6\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.258070 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:15 crc kubenswrapper[4837]: I0313 12:52:15.757817 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604274 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b" exitCode=0 Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604345 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"} Mar 13 12:52:16 crc kubenswrapper[4837]: I0313 12:52:16.604603 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"a5c56fb8f59524a48c89f88654723d62ee24c6c21452fed4f86ed021cc052e22"} Mar 13 12:52:17 crc kubenswrapper[4837]: I0313 12:52:17.615385 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} Mar 13 12:52:18 crc kubenswrapper[4837]: E0313 12:52:18.779629 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:20 crc kubenswrapper[4837]: I0313 12:52:20.051350 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:20 crc kubenswrapper[4837]: E0313 12:52:20.052349 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:22 crc kubenswrapper[4837]: I0313 12:52:22.656974 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516" exitCode=0 Mar 13 12:52:22 crc kubenswrapper[4837]: I0313 12:52:22.657055 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" 
event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} Mar 13 12:52:24 crc kubenswrapper[4837]: I0313 12:52:24.675653 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerStarted","Data":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"} Mar 13 12:52:24 crc kubenswrapper[4837]: I0313 12:52:24.700475 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dklw6" podStartSLOduration=3.44520617 podStartE2EDuration="10.700452771s" podCreationTimestamp="2026-03-13 12:52:14 +0000 UTC" firstStartedPulling="2026-03-13 12:52:16.607686116 +0000 UTC m=+3852.245952879" lastFinishedPulling="2026-03-13 12:52:23.862932717 +0000 UTC m=+3859.501199480" observedRunningTime="2026-03-13 12:52:24.692306275 +0000 UTC m=+3860.330573038" watchObservedRunningTime="2026-03-13 12:52:24.700452771 +0000 UTC m=+3860.338719544" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.258159 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.258209 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.658155 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-b7cdx_e645f00a-8463-4fac-b010-f0500b54d68a/manager/0.log" Mar 13 12:52:25 crc kubenswrapper[4837]: I0313 12:52:25.902833 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.124207 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.181032 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.303873 4837 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dklw6" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" probeResult="failure" output=< Mar 13 12:52:26 crc kubenswrapper[4837]: timeout: failed to connect service ":50051" within 1s Mar 13 12:52:26 crc kubenswrapper[4837]: > Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.426693 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.571838 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/util/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.627065 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/pull/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.789466 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e41c3cdd9c45c7396d76384269ab30db6ed7d2a76905cbc997544c01b04tv4b_53ac9dfc-487a-47cf-83f2-91542b93bb95/extract/0.log" Mar 13 12:52:26 crc kubenswrapper[4837]: I0313 12:52:26.800111 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-kbn8z_0a24601d-8e41-4f99-9e33-870d791a3e7e/manager/0.log" Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.250047 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mrgb9_1870e3ae-40fd-479c-9aa7-9ce3a3e2dd2e/manager/0.log" Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.425076 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-ss4rm_b2c881d7-03db-4608-a3f4-9a9ad8b2f5da/manager/0.log" Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.600411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-bvmr7_11a29883-0638-4da4-a1dc-bf2127a3645c/manager/0.log" Mar 13 12:52:27 crc kubenswrapper[4837]: I0313 12:52:27.975202 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-9zvxf_89e6d6f8-7bd3-4862-b41c-cd5c1f05f3e5/manager/0.log" Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.258619 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-fhlk9_c19c3466-ab50-4be3-8299-d7b8b3d263df/manager/0.log" Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.331079 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-kc2x6_9bd066a9-3999-405a-b619-540678a46ded/manager/0.log" Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.447546 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-twrg7_fa1b1ba2-3856-49cb-bda4-8ac5e63b5298/manager/0.log" Mar 13 12:52:28 crc kubenswrapper[4837]: I0313 12:52:28.769462 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7nm95_046bdee0-f0cf-4d17-916b-68d301502473/manager/0.log" Mar 13 12:52:29 crc kubenswrapper[4837]: E0313 12:52:29.026140 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:29 crc kubenswrapper[4837]: I0313 12:52:29.507627 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-6ht9l_3059d7c0-2624-4d3e-af0f-de054401f1ec/manager/0.log" Mar 13 12:52:29 crc kubenswrapper[4837]: I0313 12:52:29.522223 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-shrx7_ee1c592d-7979-4b75-b8e4-7ccd6d7d6048/manager/0.log" Mar 13 12:52:29 crc kubenswrapper[4837]: 
I0313 12:52:29.867825 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-7f7zd_561aed86-f289-4dd1-8c53-307ccdc99165/manager/0.log" Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.040783 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b77x9vc_7b38159c-e030-4734-963d-dfc38d29c75c/manager/0.log" Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.334062 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-c99df78b8-qxmfb_4f8c5e9e-7680-4bc3-8096-0c62a1de4da5/operator/0.log" Mar 13 12:52:30 crc kubenswrapper[4837]: I0313 12:52:30.518908 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mdjzs_9da10ec5-aa1b-4797-91ce-04a91266831a/registry-server/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.174452 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-nxwr9_5f00cf34-6fc4-4ee9-93e5-5ff8c6b1128d/manager/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.470358 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-fwblp_35a21ab1-95b5-446a-ae10-d004e5aa2995/manager/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.559779 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-jvdqq_1d59bb7f-598d-4c70-9b8c-ce4e3048691f/manager/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.606966 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xkk4z_ce0c89e1-3fc0-473d-875f-461c8b423061/operator/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.815440 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55876d85bb-96mp7_eaf3fa29-f441-43df-9fbe-409d9d8ad871/manager/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.864877 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-8lkmx_cb20db22-bd0e-4897-8ed6-a6a80a91ffff/manager/0.log" Mar 13 12:52:31 crc kubenswrapper[4837]: I0313 12:52:31.938919 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-cfv8z_55649f1c-678e-4e03-be55-7c4435446199/manager/0.log" Mar 13 12:52:32 crc kubenswrapper[4837]: I0313 12:52:32.114901 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-dk4nr_fe107e39-b5ec-473d-8851-b57775dadafc/manager/0.log" Mar 13 12:52:32 crc kubenswrapper[4837]: I0313 12:52:32.119871 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hrcp9_5ef20b1d-5c03-4993-b635-b031ddcab3bf/manager/0.log" Mar 13 12:52:33 crc kubenswrapper[4837]: I0313 12:52:33.116594 4837 scope.go:117] "RemoveContainer" containerID="ef54f8788c02611298b8e701f2aadf7ddd17abb3f2f5d777925238c78d7d9c68" Mar 13 12:52:34 crc kubenswrapper[4837]: I0313 12:52:34.048515 4837 scope.go:117] "RemoveContainer" 
containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:34 crc kubenswrapper[4837]: E0313 12:52:34.049144 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.308891 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.357148 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:35 crc kubenswrapper[4837]: I0313 12:52:35.548810 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:36 crc kubenswrapper[4837]: I0313 12:52:36.785905 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dklw6" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" containerID="cri-o://2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" gracePeriod=2 Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.311550 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469151 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469350 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.469390 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") pod \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\" (UID: \"f6c415fc-c9b8-4e4f-9116-024c9bee94b0\") " Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.470104 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities" (OuterVolumeSpecName: "utilities") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.478133 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w" (OuterVolumeSpecName: "kube-api-access-5ts7w") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). 
InnerVolumeSpecName "kube-api-access-5ts7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.571831 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.572213 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ts7w\" (UniqueName: \"kubernetes.io/projected/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-kube-api-access-5ts7w\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.607778 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6c415fc-c9b8-4e4f-9116-024c9bee94b0" (UID: "f6c415fc-c9b8-4e4f-9116-024c9bee94b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.674117 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6c415fc-c9b8-4e4f-9116-024c9bee94b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798369 4837 generic.go:334] "Generic (PLEG): container finished" podID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" exitCode=0 Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798413 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"} Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798443 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dklw6" event={"ID":"f6c415fc-c9b8-4e4f-9116-024c9bee94b0","Type":"ContainerDied","Data":"a5c56fb8f59524a48c89f88654723d62ee24c6c21452fed4f86ed021cc052e22"} Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798465 4837 scope.go:117] "RemoveContainer" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.798476 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dklw6" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.821396 4837 scope.go:117] "RemoveContainer" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.840781 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.856714 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dklw6"] Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.871587 4837 scope.go:117] "RemoveContainer" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.899827 4837 scope.go:117] "RemoveContainer" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.900262 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": container with ID starting with 2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87 not found: ID does not exist" containerID="2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900355 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87"} err="failed to get container status \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": rpc error: code = NotFound desc = could not find container \"2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87\": container with ID starting with 2d7bff1e3c3855fc8cb2aa32ef5753d57469af94cb0b8d6de0d346b01f07fc87 not found: ID does not exist" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900387 4837 scope.go:117] "RemoveContainer" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516" Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.900696 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": container with ID starting with 105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516 not found: ID does not exist" containerID="105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900749 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516"} err="failed to get container status \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": rpc error: code = NotFound desc = could not find container \"105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516\": container with ID starting with 105c28a231210141f36873f32a835117fd67229caf2bcd61214068a7268ee516 not found: ID does not exist" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.900785 4837 scope.go:117] "RemoveContainer" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b" Mar 13 12:52:37 crc kubenswrapper[4837]: E0313 12:52:37.901058 4837 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": container with ID starting with 466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b not found: ID does not exist" containerID="466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b" Mar 13 12:52:37 crc kubenswrapper[4837]: I0313 12:52:37.901094 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b"} err="failed to get container status \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": rpc error: code = NotFound desc = could not find container \"466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b\": container with ID starting with 466256fcf8dddb5197d5eca2dd860959fb51e832e51a92f91899b05aa8236f7b not found: ID does not exist" Mar 13 12:52:39 crc kubenswrapper[4837]: I0313 12:52:39.063949 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" path="/var/lib/kubelet/pods/f6c415fc-c9b8-4e4f-9116-024c9bee94b0/volumes" Mar 13 12:52:39 crc kubenswrapper[4837]: E0313 12:52:39.255676 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.958358 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959141 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-utilities" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959159 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-utilities" Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959190 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-content" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959198 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="extract-content" Mar 13 12:52:40 crc kubenswrapper[4837]: E0313 12:52:40.959210 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959218 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.959478 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6c415fc-c9b8-4e4f-9116-024c9bee94b0" containerName="registry-server" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.961190 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:40 crc kubenswrapper[4837]: I0313 12:52:40.969297 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143620 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143734 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.143835 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245333 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245377 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.245437 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.246175 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.246395 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.273856 4837 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"redhat-marketplace-wln6h\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.279282 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.796886 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:41 crc kubenswrapper[4837]: I0313 12:52:41.862907 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"78c851a3cb8a707bbc9d75331f4c8ac4e37dc37da153ff8366f930f1e8fb0aff"} Mar 13 12:52:42 crc kubenswrapper[4837]: I0313 12:52:42.871834 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356" exitCode=0 Mar 13 12:52:42 crc kubenswrapper[4837]: I0313 12:52:42.871917 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"} Mar 13 12:52:43 crc kubenswrapper[4837]: I0313 12:52:43.883359 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"} Mar 13 12:52:44 crc kubenswrapper[4837]: I0313 12:52:44.892698 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36" exitCode=0 Mar 13 12:52:44 crc kubenswrapper[4837]: I0313 12:52:44.892801 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"} Mar 13 12:52:45 crc kubenswrapper[4837]: I0313 12:52:45.904218 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerStarted","Data":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"} Mar 13 12:52:45 crc kubenswrapper[4837]: I0313 12:52:45.929414 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wln6h" podStartSLOduration=3.439762606 podStartE2EDuration="5.929382887s" podCreationTimestamp="2026-03-13 12:52:40 +0000 UTC" firstStartedPulling="2026-03-13 12:52:42.875584458 +0000 UTC m=+3878.513851211" lastFinishedPulling="2026-03-13 12:52:45.365204719 +0000 UTC m=+3881.003471492" observedRunningTime="2026-03-13 12:52:45.926903589 +0000 UTC m=+3881.565170382" watchObservedRunningTime="2026-03-13 12:52:45.929382887 +0000 UTC m=+3881.567649640" Mar 13 12:52:48 crc kubenswrapper[4837]: I0313 12:52:48.048697 4837 scope.go:117] "RemoveContainer" 
containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:52:48 crc kubenswrapper[4837]: E0313 12:52:48.049391 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:52:49 crc kubenswrapper[4837]: E0313 12:52:49.481422 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.279920 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.280448 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.338205 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:51 crc kubenswrapper[4837]: I0313 12:52:51.998269 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.058175 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.567076 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jrm5t_00848ba6-522a-45c7-81bd-7ab287d77626/control-plane-machine-set-operator/0.log" Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.688867 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/kube-rbac-proxy/0.log" Mar 13 12:52:52 crc kubenswrapper[4837]: I0313 12:52:52.789484 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vsp2m_6db10103-96be-4420-b302-a7064e347f61/machine-api-operator/0.log" Mar 13 12:52:53 crc kubenswrapper[4837]: I0313 12:52:53.970002 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wln6h" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" containerID="cri-o://f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" gracePeriod=2 Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.978773 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980225 4837 generic.go:334] "Generic (PLEG): container finished" podID="33a9f660-ec35-4581-bf36-1daa67adf647" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" exitCode=0 Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980291 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"} Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980370 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wln6h" event={"ID":"33a9f660-ec35-4581-bf36-1daa67adf647","Type":"ContainerDied","Data":"78c851a3cb8a707bbc9d75331f4c8ac4e37dc37da153ff8366f930f1e8fb0aff"} Mar 13 12:52:54 crc kubenswrapper[4837]: I0313 12:52:54.980396 4837 scope.go:117] "RemoveContainer" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.006904 4837 scope.go:117] "RemoveContainer" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.056405 4837 scope.go:117] "RemoveContainer" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.087737 4837 scope.go:117] "RemoveContainer" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.088179 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": container with ID starting with f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea not found: ID does not exist" containerID="f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088232 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea"} err="failed to get container status \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": rpc error: code = NotFound desc = could not find container \"f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea\": container with ID starting with f8d10e2bc125b9421cfea61f6dc8b3970cac439f803759f0dcc1aff668088bea not found: ID does not exist" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088266 4837 scope.go:117] "RemoveContainer" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36" Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.088704 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": container with ID starting with de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36 not found: ID does not exist" containerID="de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088772 4837 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36"} err="failed to get container status \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": rpc error: code = NotFound desc = could not find container \"de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36\": container with ID starting with de474c833342c79cab38b58fd8149189bb30432f5e3998f127c61412ecca9b36 not found: ID does not exist" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.088807 4837 scope.go:117] "RemoveContainer" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356" Mar 13 12:52:55 crc kubenswrapper[4837]: E0313 12:52:55.089196 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": container with ID starting with 3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356 not found: ID does not exist" containerID="3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.089238 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356"} err="failed to get container status \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": rpc error: code = NotFound desc = could not find container \"3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356\": container with ID starting with 3b4f16f40bd6bd84c7d2ec9b39478225f132d233a78e3209165694d1ece42356 not found: ID does not exist" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125509 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125570 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.125837 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") pod \"33a9f660-ec35-4581-bf36-1daa67adf647\" (UID: \"33a9f660-ec35-4581-bf36-1daa67adf647\") " Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.126843 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities" (OuterVolumeSpecName: "utilities") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.132354 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp" (OuterVolumeSpecName: "kube-api-access-x8crp") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "kube-api-access-x8crp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.162941 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33a9f660-ec35-4581-bf36-1daa67adf647" (UID: "33a9f660-ec35-4581-bf36-1daa67adf647"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228210 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228247 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8crp\" (UniqueName: \"kubernetes.io/projected/33a9f660-ec35-4581-bf36-1daa67adf647-kube-api-access-x8crp\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.228257 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a9f660-ec35-4581-bf36-1daa67adf647-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:52:55 crc kubenswrapper[4837]: I0313 12:52:55.989934 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wln6h" Mar 13 12:52:56 crc kubenswrapper[4837]: I0313 12:52:56.024070 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:56 crc kubenswrapper[4837]: I0313 12:52:56.039406 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wln6h"] Mar 13 12:52:57 crc kubenswrapper[4837]: I0313 12:52:57.058686 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" path="/var/lib/kubelet/pods/33a9f660-ec35-4581-bf36-1daa67adf647/volumes" Mar 13 12:52:59 crc kubenswrapper[4837]: E0313 12:52:59.691448 4837 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba6a258_c015_4c82_b7d0_736ea4ddf3a0.slice\": RecentStats: unable to find data in memory cache]" Mar 13 12:53:03 crc kubenswrapper[4837]: I0313 12:53:03.048415 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:03 crc kubenswrapper[4837]: E0313 12:53:03.049073 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.542957 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dlspp_5ecc1237-3421-41d5-8efb-a62399ae1d73/cert-manager-controller/0.log" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.722668 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xzv5h_67507b8e-35d5-4dff-9239-45b5ef997e53/cert-manager-cainjector/0.log" Mar 13 12:53:05 crc kubenswrapper[4837]: I0313 12:53:05.791370 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ht9vn_0e500b82-1f14-4a1e-937d-00248f195033/cert-manager-webhook/0.log" Mar 13 12:53:16 crc kubenswrapper[4837]: I0313 12:53:16.048879 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:16 crc kubenswrapper[4837]: E0313 12:53:16.049631 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.512453 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-fpxmr_00b31b3f-b520-493a-ad26-679e09376e81/nmstate-console-plugin/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.735996 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-vqqqz_ebe31727-805d-472e-89d3-e99b11435be1/nmstate-handler/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.769614 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/kube-rbac-proxy/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.899411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8xzdk_5d1f2d02-86ab-4679-a4e4-530ad37e4302/nmstate-metrics/0.log" Mar 13 12:53:18 crc kubenswrapper[4837]: I0313 12:53:18.950247 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zf78q_ef7096b9-861a-4889-9318-535c35151777/nmstate-operator/0.log" Mar 13 12:53:19 crc kubenswrapper[4837]: I0313 12:53:19.112108 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-6cx5h_0b06c77a-f41d-41a6-b115-f12cc5109c0c/nmstate-webhook/0.log" Mar 13 12:53:28 crc kubenswrapper[4837]: I0313 12:53:28.048461 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:28 crc kubenswrapper[4837]: E0313 12:53:28.049179 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:41 crc kubenswrapper[4837]: I0313 12:53:41.048887 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:41 crc kubenswrapper[4837]: E0313 12:53:41.049721 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:53:44 crc kubenswrapper[4837]: I0313 12:53:44.922475 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/kube-rbac-proxy/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.063096 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zm9dj_0ad270d6-2fc1-4ed0-8a87-bef0e59a4c88/controller/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.172508 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.342707 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.354345 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: 
I0313 12:53:45.396819 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.430055 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.550663 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.601213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.607196 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.650217 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.792106 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-reloader/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.807130 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-frr-files/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.825995 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/controller/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.843032 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/cp-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.975820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr-metrics/0.log" Mar 13 12:53:45 crc kubenswrapper[4837]: I0313 12:53:45.992689 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.020288 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/kube-rbac-proxy-frr/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.169124 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/reloader/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.299319 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jwgl7_c72405c5-2c81-43f4-93c6-f73f9771be8b/frr-k8s-webhook-server/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.434515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-dcfbdf95f-7x96d_41898fd8-d078-444c-bb55-33f4fb6f3dcc/manager/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.630467 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59b847b88-lrvzm_eabfad13-4fe4-495d-8b6a-2da56ef3b826/webhook-server/0.log" Mar 13 12:53:46 crc kubenswrapper[4837]: I0313 12:53:46.823842 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/kube-rbac-proxy/0.log" Mar 13 12:53:47 crc kubenswrapper[4837]: I0313 12:53:47.441337 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8skdh_82a5fe00-90be-47b1-a357-69942f385d4f/speaker/0.log" Mar 13 12:53:47 crc kubenswrapper[4837]: I0313 12:53:47.916477 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-f8m9m_387739fd-caae-44d0-8cbb-50808d69618b/frr/0.log" Mar 13 12:53:52 crc kubenswrapper[4837]: I0313 12:53:52.049664 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:53:52 crc kubenswrapper[4837]: E0313 12:53:52.050363 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.152332 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153325 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-content" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153339 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-content" Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153351 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153357 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: E0313 12:54:00.153365 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-utilities" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153372 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="extract-utilities" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.153594 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a9f660-ec35-4581-bf36-1daa67adf647" containerName="registry-server" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.154241 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.156415 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.157187 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.157464 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.163352 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.279815 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.381960 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.403834 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"auto-csr-approver-29556774-vbb4z\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.476006 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.972804 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 12:54:00 crc kubenswrapper[4837]: I0313 12:54:00.984329 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.542255 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerStarted","Data":"d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6"} Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.578281 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.743330 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.746954 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.790503 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.971584 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/pull/0.log" Mar 13 12:54:01 crc kubenswrapper[4837]: I0313 12:54:01.987184 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/extract/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.013820 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8749gc4h_c49e70e5-a4f6-4782-aa38-2faeb20ec38a/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.199100 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.339213 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.387379 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.390337 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.554384 4837 generic.go:334] "Generic (PLEG): container finished" podID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerID="122fd9d8a5ad9ec96047d911c5e084e75217bd7ee019902064f096162b6ade7b" exitCode=0 Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.554452 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerDied","Data":"122fd9d8a5ad9ec96047d911c5e084e75217bd7ee019902064f096162b6ade7b"} Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.556574 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/pull/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.585526 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/util/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.586038 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmcnf_b1863878-b849-4485-9e78-35c9f9856697/extract/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.748985 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.955598 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.973270 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:02 crc kubenswrapper[4837]: I0313 12:54:02.996380 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.186498 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.186659 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.399515 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.570861 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.601681 4837 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.612773 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.753481 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zckjb_4298f221-fd11-49a1-a0e9-6f95dbdedc44/registry-server/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.841419 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-content/0.log" Mar 13 12:54:03 crc kubenswrapper[4837]: I0313 12:54:03.854710 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.060273 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.150368 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7rzpc_b87c8f86-a346-4907-9441-048c3220646f/marketplace-operator/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.173561 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") pod \"0473d8e9-f078-403a-a76b-c5bb02c0840d\" (UID: \"0473d8e9-f078-403a-a76b-c5bb02c0840d\") " Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.182895 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn" (OuterVolumeSpecName: "kube-api-access-k49dn") pod "0473d8e9-f078-403a-a76b-c5bb02c0840d" (UID: "0473d8e9-f078-403a-a76b-c5bb02c0840d"). InnerVolumeSpecName "kube-api-access-k49dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.275510 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49dn\" (UniqueName: \"kubernetes.io/projected/0473d8e9-f078-403a-a76b-c5bb02c0840d-kube-api-access-k49dn\") on node \"crc\" DevicePath \"\"" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.286470 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.435527 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tjvc6_07889497-1048-4f7a-9245-132767bb28b6/registry-server/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.439598 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.501747 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.567319 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.571913 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" event={"ID":"0473d8e9-f078-403a-a76b-c5bb02c0840d","Type":"ContainerDied","Data":"d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6"} Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.571960 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d84d2372cbdd47ff9d3dd65af9962256a6b22fb77f77936dea44b769d8318cb6" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.572892 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556774-vbb4z" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.693996 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-utilities/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.740058 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/extract-content/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.820943 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m5v5n_fec78503-41e5-45f4-9217-1debe55ec107/registry-server/0.log" Mar 13 12:54:04 crc kubenswrapper[4837]: I0313 12:54:04.962899 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.057341 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:54:05 crc kubenswrapper[4837]: E0313 12:54:05.057586 4837 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2td4d_openshift-machine-config-operator(338e0d25-c97d-42ec-a8ec-51ddf77a5ed8)\"" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.135212 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.136166 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.146822 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556768-f66sk"] Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.169548 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.197912 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.369599 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-utilities/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.384411 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/extract-content/0.log" Mar 13 12:54:05 crc kubenswrapper[4837]: I0313 12:54:05.886744 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpp2z_8d96905d-521e-4ab9-87a8-d6edd0c027ed/registry-server/0.log" Mar 13 12:54:07 crc kubenswrapper[4837]: I0313 12:54:07.089213 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="1b297ac1-71ba-4b15-b915-a38f9da4ebb7" path="/var/lib/kubelet/pods/1b297ac1-71ba-4b15-b915-a38f9da4ebb7/volumes" Mar 13 12:54:20 crc kubenswrapper[4837]: I0313 12:54:20.048817 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:54:20 crc kubenswrapper[4837]: I0313 12:54:20.728910 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"} Mar 13 12:54:33 crc kubenswrapper[4837]: I0313 12:54:33.245021 4837 scope.go:117] "RemoveContainer" containerID="6da52e600ecb49afa497ca1fed54ebec9623af66e73a4cbe5e0c9804569c398b" Mar 13 12:54:37 crc kubenswrapper[4837]: E0313 12:54:37.140818 4837 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.138:35258->38.102.83.138:43005: write tcp 38.102.83.138:35258->38.102.83.138:43005: write: broken pipe Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.871738 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:55:45 crc kubenswrapper[4837]: E0313 12:55:45.873033 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.873052 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.873311 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" containerName="oc" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.874992 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.904165 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990280 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990356 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:45 crc kubenswrapper[4837]: I0313 12:55:45.990692 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092538 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092675 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.092721 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.093272 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.093409 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.622498 4837 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"community-operators-rhlvr\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:46 crc kubenswrapper[4837]: I0313 12:55:46.802278 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.273953 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641415 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902" exitCode=0 Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641463 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"} Mar 13 12:55:47 crc kubenswrapper[4837]: I0313 12:55:47.641491 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"76392729e1c1144daa565ff3166e05881bf3e6a854cfffe6c7fa94d800f4c807"} Mar 13 12:55:48 crc kubenswrapper[4837]: I0313 12:55:48.651027 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"} Mar 13 12:55:50 crc kubenswrapper[4837]: I0313 12:55:50.676469 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1" exitCode=0 Mar 13 12:55:50 crc kubenswrapper[4837]: I0313 12:55:50.676554 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"} Mar 13 12:55:51 crc kubenswrapper[4837]: I0313 12:55:51.688793 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerStarted","Data":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"} Mar 13 12:55:51 crc kubenswrapper[4837]: I0313 12:55:51.710231 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rhlvr" podStartSLOduration=3.249867477 podStartE2EDuration="6.710203664s" podCreationTimestamp="2026-03-13 12:55:45 +0000 UTC" firstStartedPulling="2026-03-13 12:55:47.64371904 +0000 UTC m=+4063.281985813" lastFinishedPulling="2026-03-13 12:55:51.104055237 +0000 UTC m=+4066.742322000" observedRunningTime="2026-03-13 12:55:51.704623059 +0000 UTC m=+4067.342889822" watchObservedRunningTime="2026-03-13 12:55:51.710203664 +0000 UTC m=+4067.348470427" Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.803616 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.804145 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:56 crc kubenswrapper[4837]: I0313 12:55:56.851577 4837 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.761214 4837 generic.go:334] "Generic (PLEG): container finished" podID="130c1c0e-31b1-415d-aab2-fab358576a73" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75" exitCode=0 Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.761417 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" event={"ID":"130c1c0e-31b1-415d-aab2-fab358576a73","Type":"ContainerDied","Data":"433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75"} Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.763376 4837 scope.go:117] "RemoveContainer" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75" Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.824369 4837 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:55:57 crc kubenswrapper[4837]: I0313 12:55:57.871945 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:55:58 crc kubenswrapper[4837]: I0313 12:55:58.414684 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/gather/0.log" Mar 13 12:55:59 crc kubenswrapper[4837]: I0313 12:55:59.778460 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rhlvr" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" containerID="cri-o://fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" gracePeriod=2 Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.146198 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"] Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.151117 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155057 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155217 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.155408 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.157219 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"] Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.157761 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.259059 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.270126 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.286312 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"auto-csr-approver-29556776-t6fxv\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.364991 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.365097 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.365186 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") pod \"1b9fb61f-c188-4889-a632-b3e0e4807ced\" (UID: \"1b9fb61f-c188-4889-a632-b3e0e4807ced\") " Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.366571 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities" (OuterVolumeSpecName: 
"utilities") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.370010 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98" (OuterVolumeSpecName: "kube-api-access-lgp98") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "kube-api-access-lgp98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.447749 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b9fb61f-c188-4889-a632-b3e0e4807ced" (UID: "1b9fb61f-c188-4889-a632-b3e0e4807ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468092 4837 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468131 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgp98\" (UniqueName: \"kubernetes.io/projected/1b9fb61f-c188-4889-a632-b3e0e4807ced-kube-api-access-lgp98\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.468149 4837 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b9fb61f-c188-4889-a632-b3e0e4807ced-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.559688 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795742 4837 generic.go:334] "Generic (PLEG): container finished" podID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" exitCode=0 Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795813 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"} Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795854 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhlvr" event={"ID":"1b9fb61f-c188-4889-a632-b3e0e4807ced","Type":"ContainerDied","Data":"76392729e1c1144daa565ff3166e05881bf3e6a854cfffe6c7fa94d800f4c807"} Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.795876 4837 scope.go:117] "RemoveContainer" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.796052 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rhlvr" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.837781 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.845152 4837 scope.go:117] "RemoveContainer" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.848504 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rhlvr"] Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.865606 4837 scope.go:117] "RemoveContainer" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.882957 4837 scope.go:117] "RemoveContainer" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.887913 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": container with ID starting with fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b not found: ID does not exist" containerID="fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.887973 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b"} err="failed to get container status \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": rpc error: code = NotFound desc = could not find container \"fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b\": container with ID starting with fe95202103b15f3830e412f24d36b5902a1bf208f03066384d4ee35a8f90949b not found: ID does not exist" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888250 4837 scope.go:117] "RemoveContainer" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1" Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.888695 4837 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": container with ID starting with 40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1 not found: ID does not exist" containerID="40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888719 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1"} err="failed to get container status \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": rpc error: code = NotFound desc = could not find container \"40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1\": container with ID starting with 40ba339fa75a80c8cfd85992b52dd9d9f36065f16925c616d3a71c9c95bd8fd1 not found: ID does not exist" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.888735 4837 scope.go:117] "RemoveContainer" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902" Mar 13 12:56:00 crc kubenswrapper[4837]: E0313 12:56:00.889020 4837 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": container with ID starting with b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902 not found: ID does not exist" containerID="b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902" Mar 13 12:56:00 crc kubenswrapper[4837]: I0313 12:56:00.889066 4837 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902"} err="failed to get container status \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": rpc error: code = NotFound desc = could not find container \"b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902\": container with ID starting with b7bf1cee88f3eaca26285ce714fdc8c7b26a974cc8bd89f746a3db6e78eba902 not found: ID does not exist" Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.012023 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556776-t6fxv"] Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.060885 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" path="/var/lib/kubelet/pods/1b9fb61f-c188-4889-a632-b3e0e4807ced/volumes" Mar 13 12:56:01 crc kubenswrapper[4837]: I0313 12:56:01.806967 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerStarted","Data":"d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177"} Mar 13 12:56:02 crc kubenswrapper[4837]: I0313 12:56:02.820446 4837 generic.go:334] "Generic (PLEG): container finished" podID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerID="1086c80585379365c2bff27c51b687e139e8eb2c034f632dcdf9de8104b8d107" exitCode=0 Mar 13 12:56:02 crc kubenswrapper[4837]: I0313 12:56:02.820587 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerDied","Data":"1086c80585379365c2bff27c51b687e139e8eb2c034f632dcdf9de8104b8d107"} Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.168847 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.341652 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") pod \"d64a81f2-3643-4e57-8322-c09c0360d46b\" (UID: \"d64a81f2-3643-4e57-8322-c09c0360d46b\") " Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.349072 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x" (OuterVolumeSpecName: "kube-api-access-jwg5x") pod "d64a81f2-3643-4e57-8322-c09c0360d46b" (UID: "d64a81f2-3643-4e57-8322-c09c0360d46b"). InnerVolumeSpecName "kube-api-access-jwg5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.444144 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwg5x\" (UniqueName: \"kubernetes.io/projected/d64a81f2-3643-4e57-8322-c09c0360d46b-kube-api-access-jwg5x\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837344 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" event={"ID":"d64a81f2-3643-4e57-8322-c09c0360d46b","Type":"ContainerDied","Data":"d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177"} Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837613 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96b73b2fd795c760721a9677f17c0f047b16fbbb6085f7694336b1ec8d78177" Mar 13 12:56:04 crc kubenswrapper[4837]: I0313 12:56:04.837410 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556776-t6fxv" Mar 13 12:56:05 crc kubenswrapper[4837]: I0313 12:56:05.229944 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:56:05 crc kubenswrapper[4837]: I0313 12:56:05.241321 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556770-5vjcn"] Mar 13 12:56:07 crc kubenswrapper[4837]: I0313 12:56:07.066791 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e3042e-9415-4734-bfa5-8def0b858b6e" path="/var/lib/kubelet/pods/39e3042e-9415-4734-bfa5-8def0b858b6e/volumes" Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.250384 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"] Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.251283 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" containerID="cri-o://bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427" gracePeriod=2 Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.263938 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lkqv5/must-gather-vz7zz"] Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.887833 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log" Mar 13 12:56:10 crc kubenswrapper[4837]: I0313 12:56:10.888188 4837 generic.go:334] "Generic (PLEG): container finished" podID="130c1c0e-31b1-415d-aab2-fab358576a73" containerID="bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427" exitCode=143 Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.169826 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.170515 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.203661 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") pod \"130c1c0e-31b1-415d-aab2-fab358576a73\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.203730 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") pod \"130c1c0e-31b1-415d-aab2-fab358576a73\" (UID: \"130c1c0e-31b1-415d-aab2-fab358576a73\") " Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.209064 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72" (OuterVolumeSpecName: "kube-api-access-tzd72") pod "130c1c0e-31b1-415d-aab2-fab358576a73" (UID: "130c1c0e-31b1-415d-aab2-fab358576a73"). InnerVolumeSpecName "kube-api-access-tzd72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.305958 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzd72\" (UniqueName: \"kubernetes.io/projected/130c1c0e-31b1-415d-aab2-fab358576a73-kube-api-access-tzd72\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.371987 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "130c1c0e-31b1-415d-aab2-fab358576a73" (UID: "130c1c0e-31b1-415d-aab2-fab358576a73"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.408285 4837 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130c1c0e-31b1-415d-aab2-fab358576a73-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897399 4837 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lkqv5_must-gather-vz7zz_130c1c0e-31b1-415d-aab2-fab358576a73/copy/0.log" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897919 4837 scope.go:117] "RemoveContainer" containerID="bc010a3c2a92443b50c947cd27f9323f2921ea8aae80c058217be8b624f5d427" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.897938 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lkqv5/must-gather-vz7zz" Mar 13 12:56:11 crc kubenswrapper[4837]: I0313 12:56:11.921507 4837 scope.go:117] "RemoveContainer" containerID="433a139fea2255c45e8580415a3deca8258493b41f46198b67c0eac345fb5a75" Mar 13 12:56:13 crc kubenswrapper[4837]: I0313 12:56:13.059945 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" path="/var/lib/kubelet/pods/130c1c0e-31b1-415d-aab2-fab358576a73/volumes" Mar 13 12:56:33 crc kubenswrapper[4837]: I0313 12:56:33.341550 4837 scope.go:117] "RemoveContainer" containerID="8a03a622bd1e0141b38071e7ff2bc9ecddb0162408970736756d5805f18fdf44" Mar 13 12:56:35 crc kubenswrapper[4837]: I0313 12:56:35.483409 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:56:35 crc kubenswrapper[4837]: I0313 12:56:35.483905 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:05 crc kubenswrapper[4837]: I0313 12:57:05.483965 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:05 crc kubenswrapper[4837]: I0313 12:57:05.484380 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484215 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484767 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.484814 4837 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.485485 4837 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"} pod="openshift-machine-config-operator/machine-config-daemon-2td4d" containerMessage="Container machine-config-daemon failed liveness probe, will 
be restarted" Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.485531 4837 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" containerID="cri-o://29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb" gracePeriod=600 Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.855973 4837 generic.go:334] "Generic (PLEG): container finished" podID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerID="29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb" exitCode=0 Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856020 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerDied","Data":"29354e15c5b23a19853789916f8ed484d322f379033f92e55fb369856d2b8dbb"} Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856240 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" event={"ID":"338e0d25-c97d-42ec-a8ec-51ddf77a5ed8","Type":"ContainerStarted","Data":"d587321d51a1203fc11d365d3e6abfaf3ed8b51e170ae3c59f3a432ec954d9de"} Mar 13 12:57:35 crc kubenswrapper[4837]: I0313 12:57:35.856260 4837 scope.go:117] "RemoveContainer" containerID="103d1c88d8df65c5ee1ffd3b6a941f712068bd4bab0d918b54b0ad8617d9e9b0" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.140282 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.142662 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.142786 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.142873 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-utilities" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.142951 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-utilities" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143043 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143146 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143232 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-content" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143314 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="extract-content" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143405 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143484 4837 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: E0313 12:58:00.143582 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143675 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.143977 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9fb61f-c188-4889-a632-b3e0e4807ced" containerName="registry-server" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144085 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="copy" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144177 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64a81f2-3643-4e57-8322-c09c0360d46b" containerName="oc" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.144259 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="130c1c0e-31b1-415d-aab2-fab358576a73" containerName="gather" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.145086 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.147371 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.148585 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.148610 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.152552 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.292821 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.394198 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.413585 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"auto-csr-approver-29556778-pbsh7\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.462419 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:00 crc kubenswrapper[4837]: W0313 12:58:00.905758 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f053d99_932e_4b5e_812a_6f58a6580bef.slice/crio-a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa WatchSource:0}: Error finding container a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa: Status 404 returned error can't find the container with id a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa Mar 13 12:58:00 crc kubenswrapper[4837]: I0313 12:58:00.915258 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556778-pbsh7"] Mar 13 12:58:01 crc kubenswrapper[4837]: I0313 12:58:01.091528 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerStarted","Data":"a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa"} Mar 13 12:58:02 crc kubenswrapper[4837]: I0313 12:58:02.100905 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerStarted","Data":"7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e"} Mar 13 12:58:02 crc kubenswrapper[4837]: I0313 12:58:02.122996 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" podStartSLOduration=1.321761548 podStartE2EDuration="2.122977652s" podCreationTimestamp="2026-03-13 12:58:00 +0000 UTC" firstStartedPulling="2026-03-13 12:58:00.907841349 +0000 UTC m=+4196.546108112" lastFinishedPulling="2026-03-13 12:58:01.709057453 +0000 UTC m=+4197.347324216" observedRunningTime="2026-03-13 12:58:02.112575925 +0000 UTC m=+4197.750842688" watchObservedRunningTime="2026-03-13 12:58:02.122977652 +0000 UTC m=+4197.761244415" Mar 13 12:58:03 crc kubenswrapper[4837]: I0313 12:58:03.110447 4837 generic.go:334] "Generic (PLEG): container finished" podID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerID="7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e" exitCode=0 Mar 13 12:58:03 crc kubenswrapper[4837]: I0313 12:58:03.110501 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerDied","Data":"7f19f392263b0041f3e39b1838209aac48dd556dd18ed9e53e8ba99d4a96b20e"} Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.483260 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.675873 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") pod \"3f053d99-932e-4b5e-812a-6f58a6580bef\" (UID: \"3f053d99-932e-4b5e-812a-6f58a6580bef\") " Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.681245 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx" (OuterVolumeSpecName: "kube-api-access-qv4qx") pod "3f053d99-932e-4b5e-812a-6f58a6580bef" (UID: "3f053d99-932e-4b5e-812a-6f58a6580bef"). InnerVolumeSpecName "kube-api-access-qv4qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 12:58:04 crc kubenswrapper[4837]: I0313 12:58:04.778560 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv4qx\" (UniqueName: \"kubernetes.io/projected/3f053d99-932e-4b5e-812a-6f58a6580bef-kube-api-access-qv4qx\") on node \"crc\" DevicePath \"\"" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134070 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" event={"ID":"3f053d99-932e-4b5e-812a-6f58a6580bef","Type":"ContainerDied","Data":"a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa"} Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134120 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7aee66ea0867e20081ad9cc8dd0c99b97ad869ab35b6a16cece45cd867b13aa" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.134163 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556778-pbsh7" Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.188830 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:58:05 crc kubenswrapper[4837]: I0313 12:58:05.197170 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556772-5k6fp"] Mar 13 12:58:07 crc kubenswrapper[4837]: I0313 12:58:07.058768 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba6a258-c015-4c82-b7d0-736ea4ddf3a0" path="/var/lib/kubelet/pods/1ba6a258-c015-4c82-b7d0-736ea4ddf3a0/volumes" Mar 13 12:58:33 crc kubenswrapper[4837]: I0313 12:58:33.482504 4837 scope.go:117] "RemoveContainer" containerID="202f4741378dc74444f14dd2386ad8db9f6a085bd9e8216a4ebc85b491ab3c81" Mar 13 12:59:35 crc kubenswrapper[4837]: I0313 12:59:35.484771 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 12:59:35 crc kubenswrapper[4837]: I0313 12:59:35.486235 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.148262 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:00 crc kubenswrapper[4837]: E0313 13:00:00.149290 4837 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.149305 4837 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.149503 4837 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f053d99-932e-4b5e-812a-6f58a6580bef" containerName="oc" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.150081 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.156594 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jlzkj" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.157293 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.157292 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.159499 4837 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.160861 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.162262 4837 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.162698 4837 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.171218 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.182230 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243142 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243208 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243513 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.243618 4837 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.345767 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.345875 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.346051 4837 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.346122 4837 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.346859 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.352159 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.366097 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"auto-csr-approver-29556780-bhggf\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.366881 4837 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"collect-profiles-29556780-6ffww\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.481530 4837 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.496681 4837 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.941001 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww"] Mar 13 13:00:00 crc kubenswrapper[4837]: W0313 13:00:00.951513 4837 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod293685bd_214f_4596_863a_1e9ecee9d95b.slice/crio-ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747 WatchSource:0}: Error finding container ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747: Status 404 returned error can't find the container with id ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747 Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.959620 4837 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 13:00:00 crc kubenswrapper[4837]: I0313 13:00:00.964652 4837 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556780-bhggf"] Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.228591 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerStarted","Data":"8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.228713 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerStarted","Data":"76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.230602 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerStarted","Data":"ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747"} Mar 13 13:00:01 crc kubenswrapper[4837]: I0313 13:00:01.243930 4837 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" podStartSLOduration=1.243914389 podStartE2EDuration="1.243914389s" podCreationTimestamp="2026-03-13 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:00:01.241941067 +0000 UTC m=+4316.880207830" watchObservedRunningTime="2026-03-13 13:00:01.243914389 +0000 UTC m=+4316.882181152" Mar 13 13:00:02 crc kubenswrapper[4837]: I0313 13:00:02.240913 4837 generic.go:334] "Generic (PLEG): container finished" podID="fa971b27-9617-4542-83d2-0f99d06a6d7a" containerID="8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22" exitCode=0 Mar 13 13:00:02 crc kubenswrapper[4837]: I0313 13:00:02.240982 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerDied","Data":"8d7cdc4c7b18315fcb8174acd2c4d4ae189b674f4f7257f266a090ea7f6dfe22"} Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.019146 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.112968 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113152 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113182 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") pod \"fa971b27-9617-4542-83d2-0f99d06a6d7a\" (UID: \"fa971b27-9617-4542-83d2-0f99d06a6d7a\") " Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.113896 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.115132 4837 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa971b27-9617-4542-83d2-0f99d06a6d7a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.119097 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.119552 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn" (OuterVolumeSpecName: "kube-api-access-j5nnn") pod "fa971b27-9617-4542-83d2-0f99d06a6d7a" (UID: "fa971b27-9617-4542-83d2-0f99d06a6d7a"). InnerVolumeSpecName "kube-api-access-j5nnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.217159 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5nnn\" (UniqueName: \"kubernetes.io/projected/fa971b27-9617-4542-83d2-0f99d06a6d7a-kube-api-access-j5nnn\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.217196 4837 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa971b27-9617-4542-83d2-0f99d06a6d7a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.257938 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" event={"ID":"fa971b27-9617-4542-83d2-0f99d06a6d7a","Type":"ContainerDied","Data":"76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e"} Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.257980 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dab8b6b65f6b1976c605d530d9fb847ddfd01e344a115de3fff71ccf82fb1e" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.258061 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556780-6ffww" Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.314883 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 13:00:04 crc kubenswrapper[4837]: I0313 13:00:04.324885 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556735-548l8"] Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.063222 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6ce131-8677-48bc-8f07-b53837bd751b" path="/var/lib/kubelet/pods/3c6ce131-8677-48bc-8f07-b53837bd751b/volumes" Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.272748 4837 generic.go:334] "Generic (PLEG): container finished" podID="293685bd-214f-4596-863a-1e9ecee9d95b" containerID="66ef54191b2db003069fe2c0851e73076dafa7515d7eec4f4c56684a523cab10" exitCode=0 Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.272808 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerDied","Data":"66ef54191b2db003069fe2c0851e73076dafa7515d7eec4f4c56684a523cab10"} Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.484484 4837 patch_prober.go:28] interesting pod/machine-config-daemon-2td4d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:00:05 crc kubenswrapper[4837]: I0313 13:00:05.485021 4837 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2td4d" podUID="338e0d25-c97d-42ec-a8ec-51ddf77a5ed8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.599872 4837 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.664001 4837 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") pod \"293685bd-214f-4596-863a-1e9ecee9d95b\" (UID: \"293685bd-214f-4596-863a-1e9ecee9d95b\") " Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.669562 4837 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj" (OuterVolumeSpecName: "kube-api-access-z7nsj") pod "293685bd-214f-4596-863a-1e9ecee9d95b" (UID: "293685bd-214f-4596-863a-1e9ecee9d95b"). InnerVolumeSpecName "kube-api-access-z7nsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:00:06 crc kubenswrapper[4837]: I0313 13:00:06.765870 4837 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nsj\" (UniqueName: \"kubernetes.io/projected/293685bd-214f-4596-863a-1e9ecee9d95b-kube-api-access-z7nsj\") on node \"crc\" DevicePath \"\"" Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292915 4837 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556780-bhggf" event={"ID":"293685bd-214f-4596-863a-1e9ecee9d95b","Type":"ContainerDied","Data":"ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747"} Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292950 4837 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556780-bhggf" Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.292962 4837 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbeba1290baa1b891587325e104a68b547b63d168b34d3f0777b61e45c84747" Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.666223 4837 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 13:00:07 crc kubenswrapper[4837]: I0313 13:00:07.674901 4837 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556774-vbb4z"] Mar 13 13:00:09 crc kubenswrapper[4837]: I0313 13:00:09.069378 4837 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0473d8e9-f078-403a-a76b-c5bb02c0840d" path="/var/lib/kubelet/pods/0473d8e9-f078-403a-a76b-c5bb02c0840d/volumes"